EDIFACT Converter for SAP XI 3.0.

Hello,
I have heard that there is no native, out-of-the-box EDI converter/adapter for SAP XI 3.0 developed by SAP itself. Is this true?
Do you have an overview of resellers/suppliers who deliver EDI converters, and perhaps also a comparison of their features and of the additional license fees that have to be considered?
Thank you very much.

Hi,
"I have heard that there is no native, out-of-the-box EDI converter/adapter for SAP XI 3.0 developed by SAP itself. Is this true?"
Yes, it is true. However, there are some third-party EDI mapping converters, such as the BIC mapper and the cantio mapper,
but no adapters.
See the links below for some details about Seeburger and others; they may help you:
https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/135b0b94-0701-0010-f6a9-86a14057544a
/people/bla.suranyi/blog/2006/06/08/sap-xi-supports-edifact
/people/william.li/blog/2006/03/17/how-to-get-started-using-conversion-agent-from-itemfield
/people/paul.medaille/blog/2005/11/17/more-on-the-sap-conversion-agent-by-itemfield
http://www.stylusstudio.com/edi/XML_to_X12.html
https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/b0b355ae-0501-0010-3b83-8f2bb566fa47
Details on the XI EDI adapter from Seeburger:
Check this for conversions:
/people/bla.suranyi/blog/2006/06/08/sap-xi-supports-edifact
http://www.seeburger.it/fileadmin/it/pdf/2005_04_sapphire_Ferrero_transcript.pdf
http://www.seeburger.com/fileadmin/com/pdf/Butler_Group_SEEBURGER_Technology_Audit.pdf
http://www.seeburger.com/fileadmin/com/pdf/AS2_General_Overview.pdf
SAP Adapters
EDI with XI
http://www.seeburger.com
Regards
Chilla

Similar Messages

  • Convert PLD to Crystal Report for SAP Business One 9 or higher

    Hi Experts
    Please guide me on how to convert PLD to Crystal Reports for SAP Business One 9 or higher. I have tried to use the B1 Crystal Converter for 8.8, but it's not working with SBO 9, so please guide me as per SBO 9.
    Thanks in advance...

    Hi,
    Check this thread:
    http://scn.sap.com/thread/3391875
    Thanks & Regards,
    Nagarajan

  • Convert SQL server database into SAP readable (encrypted) XML for SAP tool?

    Could anyone kindly let me know the procedure to convert a SQL Server database into SAP-readable (encrypted) XML for the SAP Authoring tool?

    So if I understood correctly, there is an existing proprietary question bank on SQL Server, and you are looking for an option to migrate all the tests and questions from the existing system to the LSO system. Right?
    I am still not clear on the XML conversion. Have you found a solution that could be achieved through an XML file?
    I am not aware of a way through which you could import only an XML file and create tests/questions. If you have a sample XML file, forward it to me so that I can do some testing on my end. As far as I know, you could do one of the following:
    1. Create the tests and questions manually in the Authoring Environment. It will be a time-consuming task. Based on the number of questions you have, you might have to assemble a team of content developers to achieve this.
    2. Alternatively, you could create an Adobe Flash based assessment. The Flash component would be the front end and would read from an XML file to display the questions and to drive the functionality. This would be easier and less time-consuming than creating the assessments manually in the Authoring Environment. However, you might miss out on some of the functionality available in the Test Author of the Authoring Environment unless you have all the functionality replicated inside Flash. This requires a one-time effort in creating the Flash template and the XML file structure. Once that is created, you can create multiple assessments by just replacing the XML file. If you select this approach, you will have to ensure the data from SQL is converted into the desired XML format required by your Flash component.
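    If you go with option 2, a small standalone utility can export the SQL Server data into the XML layout your Flash component expects. Here is a minimal sketch; the JDBC URL, credentials, table and column names (Questions, QuestionId, QuestionText, CorrectAnswer) and the output layout are only illustrative assumptions, not the actual question-bank schema or LSO format.
    import java.io.PrintWriter;
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    // Sketch: export question-bank rows from SQL Server into a simple XML file
    // that a Flash front end (or any other consumer) can read.
    public class QuestionExporter {
        public static void main(String[] args) throws Exception {
            // assumed connection data and schema; replace with your own
            String url = "jdbc:sqlserver://localhost:1433;databaseName=QuestionBank";
            try (Connection con = DriverManager.getConnection(url, "user", "password");
                 Statement stmt = con.createStatement();
                 ResultSet rs = stmt.executeQuery(
                         "SELECT QuestionId, QuestionText, CorrectAnswer FROM Questions");
                 PrintWriter out = new PrintWriter("questions.xml", "UTF-8")) {
                out.println("<?xml version=\"1.0\" encoding=\"UTF-8\"?>");
                out.println("<questions>");
                while (rs.next()) {
                    out.println("  <question id=\"" + rs.getInt("QuestionId") + "\">");
                    out.println("    <text>" + escape(rs.getString("QuestionText")) + "</text>");
                    out.println("    <answer>" + escape(rs.getString("CorrectAnswer")) + "</answer>");
                    out.println("  </question>");
                }
                out.println("</questions>");
            }
        }

        // basic XML escaping for element content
        private static String escape(String s) {
            return s == null ? "" : s.replace("&", "&amp;").replace("<", "&lt;").replace(">", "&gt;");
        }
    }
    Whether this plain XML then needs an additional encryption or packaging step for the Authoring tool is something the tool's documentation would have to answer.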
    Please let me know if you require any further guidance or clarification regarding this.
    Regards,
    Ravi Sekhar

  • File content conversion for multi-mapping occurrence 1:N

    Hi ALL,
    In my scenario I have used multi-mapping (1:N).
    The sender is a file adapter and the receiver is a JMS adapter.
    My requirement is that I have to pick the file from the respective directory path
    through file content conversion.
    For example, below is the structure produced when the sender file adapter picks up the flat file through file content conversion:
    <?xml version="1.0" encoding="utf-8"?>
    <ns1:mt_BookingConfirmation xmlns:ns1="urn:agrp:ml">
         <Recordset>
             <BOOKING_REQCON_ROU>
               <RECORD_TYPE>ROU</RECORD_TYPE>
               <SENDER_ID>AS.MAN</SENDER_ID>
               <RECEIVER_ID>abcd.99018293</RECEIVER_ID>
            </BOOKING_REQCON_ROU>
         </Recordset>
    </ns1:mt_BookingConfirmation>
    But the structure of the source that I have in the mapping is as below. You can see the difference, and this difference appears automatically when we change the occurrence of the target structure.
    <?xml version="1.0" encoding="UTF-8"?>
    <ns0:Messages xmlns:ns0="http://sap.com/xi/XI/SplitAndMerge">
       <ns0:Message1>
          <ns1:mt_BookingConfirmation xmlns:ns1="urn:agrp:ml">
            <Recordset>
             <BOOKING_REQCON_ROU>
               <RECORD_TYPE>ROU</RECORD_TYPE>
               <SENDER_ID>AS.MAN</SENDER_ID>
               <RECEIVER_ID>abcd.99018293</RECEIVER_ID>
            </BOOKING_REQCON_ROU>
         </Recordset>
          </ns1:mt_BookingConfirmation>
       </ns0:Message1>
    </ns0:Messages>
    Because of this difference in the structure, the picked file cannot pass through the message mapping, and an error is thrown in SXMB_MONI.
    If someone knows how to do the file content conversion for this type of differing structure,
    please let me know ASAP.
    Thanks in advance...
    Best Regards,
    Aravind.Pujari

    Hi,
    Check the links below.
    File content conversion resources:
    /people/venkat.donela/blog/2005/03/02/introduction-to-simplefile-xi-filescenario-and-complete-walk-through-for-starterspart1
    /people/venkat.donela/blog/2005/03/03/introduction-to-simple-file-xi-filescenario-and-complete-walk-through-for-starterspart2
    /people/arpit.seth/blog/2005/06/02/file-receiver-with-content-conversion
    /people/anish.abraham2/blog/2005/06/08/content-conversion-patternrandom-content-in-input-file
    /people/shabarish.vijayakumar/blog/2005/08/17/nab-the-tab-file-adapter
    /people/venkat.donela/blog/2005/06/08/how-to-send-a-flat-file-with-various-field-lengths-and-variable-substructures-to-xi-30
    /people/jeyakumar.muthu2/blog/2005/11/29/file-content-conversion-for-unequal-number-of-columns
    /people/shabarish.vijayakumar/blog/2006/02/27/content-conversion-the-key-field-problem
    /people/michal.krawczyk2/blog/2004/12/15/how-to-send-a-flat-file-with-fixed-lengths-to-xi-30-using-a-central-file-adapter
    http://help.sap.com/saphelp_nw04/helpdata/en/d2/bab440c97f3716e10000000a155106/content.htm
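    If the ns0:Messages/ns0:Message1 envelope really has to be present before the 1:N message mapping runs, one possible workaround is a small Java mapping step that simply wraps the file content conversion output in that envelope. The sketch below shows only the wrapping logic as a plain standalone class (class and method names are illustrative; it is not tied to the XI mapping API, so adapt it to your own mapping step):
    import java.io.ByteArrayInputStream;
    import java.io.ByteArrayOutputStream;
    import java.io.InputStream;
    import java.io.OutputStream;

    // Sketch: wrap a single payload in the ns0:Messages/ns0:Message1 envelope
    // expected by a 1:N multi-mapping.
    public class SplitAndMergeWrapper {

        public static void wrap(InputStream in, OutputStream out) throws Exception {
            String body = new String(in.readAllBytes(), "UTF-8");
            // strip the XML declaration of the inner document, if present
            if (body.startsWith("<?xml")) {
                body = body.substring(body.indexOf("?>") + 2);
            }
            String wrapped = "<?xml version=\"1.0\" encoding=\"UTF-8\"?>"
                    + "<ns0:Messages xmlns:ns0=\"http://sap.com/xi/XI/SplitAndMerge\">"
                    + "<ns0:Message1>" + body + "</ns0:Message1>"
                    + "</ns0:Messages>";
            out.write(wrapped.getBytes("UTF-8"));
        }

        // small self-test with a payload similar to the booking confirmation above
        public static void main(String[] args) throws Exception {
            String sample = "<?xml version=\"1.0\" encoding=\"utf-8\"?>"
                    + "<ns1:mt_BookingConfirmation xmlns:ns1=\"urn:agrp:ml\"><Recordset/></ns1:mt_BookingConfirmation>";
            ByteArrayOutputStream result = new ByteArrayOutputStream();
            wrap(new ByteArrayInputStream(sample.getBytes("UTF-8")), result);
            System.out.println(result.toString("UTF-8"));
        }
    }
    Whether this belongs in a separate mapping step before the 1:N message mapping, or whether the standard multi-mapping handling already covers your case, is something to verify in your scenario.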
    Regards,
    Phani
    Reward Points if Helpful

  • Siemens Material for SAP-XI

    Hi,
    Could you please provide me with the Siemens material for SAP-XI?
    Thanks,
    RamuV

    Hi Ramu,
         Why are you specifically asking for Siemens material? A lot of information is available in SDN blogs. I am sending you the links; please go through them, as they cover XI completely.
    SAP XI supports internal company scenarios and cross-company scenarios.
    XI has the following components:
    1) Design and Configuration time components http://help.sap.com/saphelp_nw04/helpdata/en/0d/5ab43b274a960de10000000a114084/frameset.htm
    2) Mappings
    3) Adapters
    http://help.sap.com/saphelp_nw04/helpdata/en/0d/5ab43b274a960de10000000a114084/frameset.htm
    4) CCBPM
    http://help.sap.com/saphelp_nw04/helpdata/en/0d/5ab43b274a960de10000000a114084/frameset.htm
    5) Central monitoring
    http://help.sap.com/saphelp_nw04/helpdata/en/7c/14b5765255e345a9e3f044f1e9bbbf/frameset.htm
    Refer to all the links below to get started:
    http://help.sap.com/saphelp_nw04/helpdata/en/e1/8e51341a06084de10000009b38f83b/frameset.htm
    /thread/143337 [original link is broken]
    XI for starters
    Help in XI
    Hi all, I am learning xi, any body have lifecycle document of xi projec
    Regarding XI installation and Learning
    1. Aspirant to learn SAP XI...You won the Jackpot if you read this! - Part I
    2. Aspirant to learn SAP XI...You won the Jackpot if you read this! - Part II
    3. Aspirant to learn SAP XI...You won the Jackpot if you read this! - Part III
    Once you are through, follow the links below to weblogs that explain all the basic scenarios in XI and will help you develop them.
    Introduction to IDoc-XI-File scenario and complete walk through for starters. - IDoc to File
    ABAP Proxies in XI(Client Proxy) - ABAP Proxy to File
    FILE to JDBC Adapter using SAP XI 3.0 - File to JDBC
    File to R/3 via ABAP Proxy - File to ABAP Proxy
    Introduction to simple(File-XI-File)scenario and complete walk through for starters(Part1) - File to File Part 1
    Introduction to simple (File-XI-File)scenario and complete walk through for starters(Part2) - File to File Part 2
    Convert any flat file to any Idoc-Java Mapping - Any flat file to any Idoc
    RFC Scenario using BPM --Starter Kit - File to RFC
    https://www.sdn.sap.com/irj/sdn/weblogs?blog=/pub/wlg/1685 [original link is broken] - File to Mail
    Dynamic File Name using XI 3.0 SP12 Part - I - Dynamic File Name Part 1
    Dynamic file name(XSLT Mapping with Java Enhancement) using XI 3.0 SP12 Part -II - Dynamic File Name Part 2
    The specified item was not found. - Dynamic Mail Address
    Understanding message flow in XI - Message Flow in XI
    Walkthrough with BPM - Walk through BPM
    Schedule Your BPM - Schedule BPM
    Demonstrating Use of Synchronous-Asynchronous Bridge to Integrate Synchronous and Asynchronous systems using ccBPM in SAP Xi - Use of Synch - Asynch bridge in ccBPM
    https://www.sdn.sap.com/irj/sdn/weblogs?blog=/pub/wlg/1403 [original link is broken] - Use of Synch - Asynch bridge in ccBPM
    The specified item was not found. - Maintain RFC destination centrally
    Triggering e-Mails to Shared folders of SAP IS-U - Triggering Email from folder
    Outbound Idoc's - Work around using "Party"? - Handling different partners for IDoc
    /people/siva.maranani/blog/2005/08/27/modeling-integration-scenario146s-in-xi - Modeling Integration Scenarios in XI
    The specified item was not found. - Testing of integration process
    The specified item was not found. - Authorization in XI
    http://help.sap.com/saphelp_nw04/helpdata/en/58/d22940cbf2195de10000000a1550b0/content.htm - Authorization in XI
    The specified item was not found. - Alert Configuration
    The specified item was not found. - Trouble shoot alert config
    Executing Unix shell script using Operating System Command in XI - Call UNIX Shell Script
    Overview of Transition from Dev to QA in XI - Transport in XI
    Using ABAP XSLT Extensions for XI Mapping - Using ABAP XSLT Extensions for XI Mapping
    /people/prasad.ulagappan2/blog/2005/06/07/mail-adapter-scenarios-150-sap-exchange-infrastructure - Mail Adaptor options
    IDOCs (Multiple Types) Collection in BPM - Collection of IDoc to Single File
    XI : Controlling access to Sensitive Interfaces - Controlling access to Sensitive Interfaces
    The specified item was not found. - The same filename from a sender to a receiver file adapter - SP14
    Payload Based Message Search in XI30 using Trex Engine - Payload Based Message Search in XI30 using Trex Engine
    XI : Configuring CCMS Monitoring for XI - Part I - XI : Configuring CCMS Monitoring for XI - Part I
    The specified item was not found. - XI: HTML e-mails from the receiver mail adapter
    XI : FAQ's Provided by SAP (Updated) - XI : FAQ's Provided by SAP
    Thanks,
    Chirag
    Note: reward points if the solution is helpful.
    Regards,
    Chandrakanth.k

  • How to make cross Tab report for SAP B1

    Hello,
    I am trying to make a cross-tab report for SAP B1, but text columns do not appear. In
    File >> New >> Cross Tab Report, I start the cross-tab report creation wizard and select the database name and table,
    but only numeric columns appear in the available column fields. How can I make all fields appear, including text fields?

    I got the solution,
    thanks to Google.
    Memo-type fields will not appear in the list. Use a statement like the one below to convert them to an integer or char type:
    CONVERT(int, U_Dist_Sr_No) AS Dist_Sr, CAST(U_Dist_Name AS varchar(250)) AS Dist_Name,

  • NW 7.3 vs Unicode - Question for SAP

    Hi,
    Is it true that we have to convert to Unicode before we start the upgrade for NW BW 7.3?
    Please confirm.
    Regards,
    SC

    If your system is MDMP, yes, you will have to. If your system is a non-Unicode single code-page (SCP) system, then to date SAP only states that an upgrade to SCP non-Unicode is possible up to SAP NW 7.0 EhP1, but it is not recommended. Check this SAP Note for further details:
    73606 - Supported Languages and Code Pages
    Nothing is yet out for SAP NW 7.3.
    - Regards, Dibya

  • Tips required for SAP XI online training from SAP education partner?

    Hi,
    Recently I registered for an online SAP XI course from an SAP education partner. I would like some advice, or for others to share their experience, on how to take the online course effectively; I am a working person and will be able to spend only 2.5 hours daily. Also, please provide some tips for clearing the SAP XI certification at the end of the course.
    Your help will be well appreciated.
    Thanks and Regards
    Praveen.

    Hi,
    If you are going to take the certification in the near future, please go through the questions below; they should help.
    XI Certification info:
    http://www.sap.com/services/education/certification/certificationtest.epx?context=FFC760B8923D16BB5150DAE63E7C1A6B331AF0B9E3A8F73CE3A9B7046E051044503600C911DBA13DCE978D3AC9057626D2B68111A7CD2D707E2EEC31213097E46EB790DD0106435EE0756F7B22F3FA4B4FF0645C06954BF3A150E023B4164DA282F33B1BD441FBE4083F2C923E33EA0C5960F8C8950FDEB6081ABED6E0E05E2617A693D85077260F9EA218716A79AEF3AA57C7D5E8975334%7cE41BF735AA1862BF88AE1830FD06B8FCD2D8AECBF5109A3E
    Refer to the thread below and the certification details:
    Re: XI Certifications
    Certification passing mark:
    Re: XI/PI certification
    Refer http://www.sapteched.com/india/reg/certificationExams2.htm for information and eligibility.
    Note: All of these questions were framed by us from the TBIT courses during our certification exams. This is just for your reference; please do not assume that these will definitely come up in the exam, but they will help to some extent to gauge your preparation level.
    Refer to the thread below for more certification help as well:
    TBIT40:
    1) XI is positioned under which if the following
    1) People Integration
    2) Information Integration
    3) Process Integration
    4) Application Platform
    2) SAP XI support which of the following
    a) Sync and Async communication
    b) Sync communication
    c) Async communication
    d) None of the above
    3) SAP XI is an integration technology and platform for
    1) SAP and non SAP application
    2) A2A and B2B scenarios
    3) Async and sync communication
    4) Cross component business process management.
    4) XI uses which of the following web stds for communication with the other systems
    1. WSDL
    2. SOAP
    3. XML
    4. None of the above
    5) Integration builder uses which of the following framework
    1. Client Server framework
    2. Client framework
    3. Server framework
    4. All of the above
    6) Which of the following systems can be integrated with SAP XI
    1. SAP system only
    2. SAP and Non SAP systems
    3. Business Partners and Public Marketplaces
    4. All of the above
    7) SLD uses which framework
    1) Client Server framework
    2) Client framework
    3) Server framework
    4) All of the above
    7) SLD is based on which architecture
    1) Common Interface Model proposed by Distributed management Task force.
    8) Which is equivalent to QoS
    • BE → sRFC
    • EO → tRFC
    • EOIO → qRFC
    9) XI message protocol is based on
    • SOAP envelope with message header or SOAP message with attachment
    • SOAP envelope without message header
    • All of the above
    10) XI transport protocol is based on
    • HTTP or HTTPS
    11) Which of the core services the adapter framework provides
    • Messaging
    • Queuing
    • Security Handling
    • All of the above
    12) Central Monitoring tool provided which of the following services
    • Component monitoring
    • Message monitoring
    • Graphical end to end and performance monitoring
    • All of the above
    13) SLD supports which of the following dimensions.
    • Solution dimension
    • Transport dimension
    • Technical dimension
    • All of the above
    14) Two main areas of SLD are
    • System catalog and Software catalog
    15) Which contains the information about all the software component and products including their versions and all info about third party systems
    • Component landscape
    • Landscape Description
    • Business systems
    • Technical Systems
    16) Which contains the information about the installed landscape element
    • Component landscape
    • Landscape Description
    • Business systems
    • Technical Systems
    17) True or False
    • Technical system may have multiple components installed on it → True
    • Software product is composed of one or more software components → True
    • Business system name must be unique for the landscape → True
    • Technical system can have only one Business system associated with it → False
    • Software component system can have more than one software component version associated with it → True
    • Data type can be reused in other structures → True
    True or false:
    1. XML messaging is used in XI to implement tight coupling.
    2. Integration Engine is a Product to be used at runtime and relies on content of Integration Directory
    3. SLD is an Open architecture, based on CIM
    4. Synchronous communication makes use of Q.S, “Best effort“ ONLY.
    5. Asynchronous communication makes use of Q.S, exactly-once-in-order ONLY
    6. Adapter framework based on Java Connector Architecture (JCA)
    7. Security enrichments for B2B consists of Digital Signature AND Encryption
    8. Partner Connectivity Kit enable partners of XI customers to conduct XML document exchange with XI
    9. Products are collections of Software Components and both Software Products and Components exist in versions.
    10. In the Software Catalog we define each Technical System in the landscape.
    11. The relationship between a software product and its constituent components is described in the SLD by an association called a Software Feature.
    12. SAP XI utilizes two types of content: mention those.
    13. Mention the 3 dimensions of System .
    14. SLD is based on Configuration Information Model (CIM) of the Distributed Management Task Force (DMTF)
    15. SLD is also the basis for SAP Solution Manager
    16. XI Integration is interface-driven.
    17. Message types and Data types are defined as XML Schema (XSD)
    18. The Interface type corresponds to the root of the XML message
    19. The Message Interface is the highest-level representation of XML metadata
    Adapters:
    1) Adapters convert messages from XI protocol (SOAP with attachments over Http) to the respective protocol spoken by the application and vice versa (True / False)
    2) Application can be categorized based on their function
    a) Application Adapter
    b) Technical Adapter
    c) Industry Adapter
    d) Business Adapter
    3) SAP, SIEBEL is not an application adapter (True / False)
    4) File, RFC, JMS, JDBC, MarketPlaces are Technical Adapter (True / False)
    5) Rosettanet & CIDX is Business Adapter (True / False) (Industry Adapter)
    6) Technical Adapter are low level adapter, they just provide gateway to the application system. (True / False)
    7) Plain Http & IDOC adapter resides within the Integration Server (ABAP)
    (True / False)
    8) Adapter framework is the core of Adapter engine. What core services are common to all the adapters?
    a) Messaging,
    b) Queuing,
    c) Security Handling,
    d) Debugging
    9) Adapter framework supports JCA J2EE Connector Architecture (True / False)
    10) Adapter engine can be deployed centrally and not locally (True / False)
    11) Central Adapter engine is manually installed with Integration Sever
    (True / False)
    12) Monitoring of IDOC & Plain Http Adapter is not done thru RWB there are specific ABAP transaction to monitor these adapters
    (True / False)
    13) Adapters are known as the Inside-Out approach (True / False)
    14) Proxies are known as the Outside-In approach (True / False)
    15) Communication between PCK & SAP XI is via http messaging protocol
    (True / False)
    16) SAP PCK includes the following adapters FILE, JDBC, JMS SOAP, IDOC (True / False)
    17) Adapter Framework has its own persistence Layer (True / False)
    18) In case of JMS & JDBC vendors specific drivers must be deployed in the SAP J2EE engine in order to function properly (True / False)
    19) Communication between Integration Sever & Adapter Engine is done using SAP XI messaging protocol (True / False)
    20) Since PCK is detached from Integration Central configuration and administration is not possible (True / False)
    BPM:
    1. The pattern that can be used to receive Multiple messages in parallel and send the message in the reverse sequence
    1. Collect
    2. Multicast
    3. Serialization
    4. Sync/Async
    2. The pattern used to send messages to multiple receivers and waiting for a response message from each receiver is known as
    1. Collect
    2. Multicast
    3. Serialization
    4. Sync/Async
    3. A local correlation is used to send messages to multiple receivers so that a separate instance of the correlation can be processed for each receiver.
    True/ False
    4. In serialization you cannot specify that the process must wait for an acknowledgement each time that it sends a message
    True/ False
    5. You should always use multiline container whenever you create a collect pattern.
    True / false
    6. You should use an infinite while loop for Collect patterns of type
    1. Payload dependent
    2. Time dependent
    3. Message dependent
    4. Different interfaces.
    7. The collect pattern of Different interfaces can be done without a Fork step
    True/ false.
    8. if Multiple conditions are specified for a fork statement the conditions are checked in sequence
    True/False
    Refer the below links:
    /people/sravya.talanki2/blog/2006/12/27/aspirant-to-learn-sap-xiyou-won-the-jackpot-if-you-read-this-part-iii
    Mapping techniques:
    https://www.sdn.sap.com/irj/sdn/wiki?path=/display/xi/mapping%2btechniques
    Different types of Mapping in XI:
    /people/ravikumar.allampallam/blog/2005/02/10/different-types-of-mapping-in-xi
    ValueMapping using the Graphical Mapping Tool
    /people/community.user/blog/2007/01/08/valuemapping-using-the-graphical-mapping-tool
    Mapping Templates:
    http://help.sap.com/saphelp_nw04/helpdata/en/79/2835b7848c458bb42cf8de0bcc1ace/frameset.htm
    Message Mappings:
    http://help.sap.com/saphelp_nw04/helpdata/en/49/1ebc6111ea2f45a9946c702b685299/frameset.htm
    Multi-Mappings:
    http://help.sap.com/saphelp_nw04/helpdata/en/79/2835b7848c458bb42cf8de0bcc1ace/frameset.htm
    Standard Functions Most imp topic:
    http://help.sap.com/saphelp_nw04/helpdata/en/43/c4cdfc334824478090739c04c4a249/frameset.htm
    These are some of the mapping related questions which we have frame during my certification time.
    1) Which of the mapping is not available in XI By default
    a. Message Mapping
    b. ABAP Mapping
    c. Java Mapping
    d. XSLT Mapping
    e. None of the above.
    2) Choose the true about types of mapping XI supports
    a. Structure Mapping
    b. Node mapping
    c. Value Mapping
    d. Functional Mapping
    3) During Mapping the attribute or element is already assigned mapping, but mapping is not complete. What is the color of ICON for the element/attribute?
    a. Red
    b. White
    c. Yellow
    d. Green
    4) Which of the following is not generic function?
    a. concat
    b. CopyValue
    c. DateTrans
    d. ifWithoutElse
    5) Which of the following is not true about user-defined simple functions
    a. process individual input filed values for each function call
    b. Expect string as input parameter
    c. Takes the object of GlobalContainer class
    d. Return a string
    6) Which of the following are required to create a user defined function
    a. Name
    b. Description
    c. Argument Count
    d. Label
    7) Which of the following package are by default imported in user defined functions
    a. java.lang
    b. java.io
    c. java.reflect
    d. All the above
    e. None of the above
    8) Mapping trace does not support following function. Choose one
    a. addWarning
    b. addDebugMessage
    c. addDebugInfo
    d. addInfo
    9) In Mapping Trace addWarning function which of the following is true.
    a. Supports trace level 1,2 and 3 .
    b. Supports trace level 1,2 .
    c. Supports trace level 1 .
    d. Supports trace level 1,2 ,3 and 4 .
    10) Which of the following is not true about multi mapping
    a. Multimapping supports m:n transformation
    b. Use Abstract interfaces
    c. Can be implemented without ccBPM
    d. Development is same as Message mapping
    11) Mapping templates can be defined for structure mapping for the following structures. Choose the wrong one.
    a. Data type true
    b. Complex types in IDOC and RFC’s true
    c. Complex types in external definitions true
    d. Message types false
    e. Reference types used in multiple templates can be located in any software component. True
    12) Which of the following is true about XSLT/JAVA mapping –
    a. You can create a .jar or .zip in XI
    b. Create new imported archive and import .jar in IR
    c. Upload XSLT/JAVA programs and can modify them within XI
    13) In Standard function category, date format is –
    a. ddMMyy
    b. MM/dd/yyyy
    c. ddmmyyyy
    d. MM/dd/yy
    14) Which of the following are mapping objects –
    a. Imported Archives
    b. Message mapping
    c. Mapping template
    d. Interface mapping
    15) The conditions in Boolean function can be represented in the form of –
    a. Square
    b. Triangle
    c. Circle
    d. Rhombus
    16) Which of the following are Standard functions –
    a. Concat
    b. Date
    c. Substring
    d. FormatNum
    17) How do you define mapping of non-mandatory nodes –
    a. minOccurs > 1
    b. minOccurs = 0
    c. maxOccurs > 1
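    As a practical complement to the user-defined function and trace questions above, here is a small standalone sketch of the kind of logic a simple user-defined function contains. In the actual XI 3.0 mapping editor you would enter only the method body, and trace output would go through the mapping trace (the addInfo/addWarning functions referred to above); the class, the trace stand-ins, and the function name formatValue below are illustrative assumptions only:
    // Standalone sketch of the logic of a simple user-defined function.
    // The trace stand-ins below only mimic addInfo/addWarning from the
    // mapping trace; they are not the XI mapping API.
    public class UdfSketch {

        // minimal stand-ins for the mapping trace (assumption, not the XI API)
        static void addInfo(String msg)    { System.out.println("INFO:    " + msg); }
        static void addWarning(String msg) { System.out.println("WARNING: " + msg); }

        // simple user-defined functions receive and return strings (see question 5 above)
        public static String formatValue(String a) {
            addInfo("formatValue called with: " + a);
            if (a == null || a.length() == 0) {
                addWarning("empty input value");
                return "UNKNOWN";
            }
            return a.trim().toUpperCase();
        }

        public static void main(String[] args) {
            System.out.println(formatValue("  material-4711 "));   // trace INFO line, then MATERIAL-4711
            System.out.println(formatValue(""));                   // trace INFO + WARNING lines, then UNKNOWN
        }
    }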
    Thnx
    Chirag

  • Output formats for SAP documents

    Hi,
    I am trying to set up our SAP system such that all SAP documents are sent with HTML attachments.
    I have gone into transaction SCOT and, under the 'Output formats for SAP documents' section, I tried to select the HTM option for SAPscripts/Smart Forms; however, there is no such option.
    I can see the HTM option for ABAP List though.
    Does anyone know how I can select the HTML option for SAPScripts?
    Thanks,
    Ruby

    Here is some sample code for achieving manual conversion to the OTF format.
    In case of more than 10 pages:
    - Check the spool request
    - Click Edit -> OTF display -> No. of OTF pages
    Convert SAP Script to text:
    - Display the spool request
    - Then click Goto -> List display
    Automatic conversion to the OTF format:
    tables: tline.
    data: begin of int_tline1 occurs 100.
            include structure tline.
    data: end of int_tline1.
    " open the form with output device OTF_MEM so the OTF result is kept in memory
    call function 'OPEN_FORM'
      exporting
        device = 'OTF_MEM'.
    " ... WRITE_FORM calls ...
    " after CLOSE_FORM, convert the OTF data held in memory to ASCII text
    call function 'CONVERT_OTF_MEMORY'
      exporting
        format                = 'ASCII'
        max_linewidth         = 132
      tables
        lines                 = int_tline1
      exceptions
        err_max_linewidth     = 1
        err_format            = 2
        err_conv_not_possible = 3
        others                = 4.
    " write the converted text lines to the list/spool output
    loop at int_tline1.
      if int_tline1-tdline = space.
        skip.
      else.
        write: / int_tline1-tdline.
      endif.
    endloop.
    CONVERT_OTF converts SAP documents (SAPscript OTF) to other formats.
    Example:
    CALL FUNCTION 'CONVERT_OTF'
           EXPORTING    FORMAT                = 'PDF'
           IMPORTING    BIN_FILESIZE          = FILE_LEN
           TABLES       OTF                   = OTFDATA
                        LINES                 = PDFDATA
           EXCEPTIONS   ERR_MAX_LINEWIDTH     = 1
                        ERR_FORMAT            = 2
                        ERR_CONV_NOT_POSSIBLE = 3
                        OTHERS                = 4.
    CONVERT_OTFSPOOLJOB_2_PDF  converts an OTF spool request to PDF (i.e. a SAPscript document)
    SX_OBJECT_CONVERT_OTF_PDF  conversion from OTF to PDF (SAPscript conversion)
    SX_OBJECT_CONVERT_OTF_PRT  conversion from OTF to printer format (SAPscript conversion)
    SX_OBJECT_CONVERT_OTF_RAW  conversion from OTF to ASCII (SAPscript conversion)
    regards,
    srinivas
    *Reward points for useful answers*

  • FAQS FOR SAP BW

    Hi,
    Friends, I need some FAQs for interviews.
    Thanks and regards,
    Shafeeq Ahmed

    Hi Shafeeq,
    Here are some questions and answers.
    1) Name the two tables that provide detailed information about a DataSource.
    2) How and when can you control whether a repeat delta is requested?
    3) How can you improve the performance of a query?
    4) How do you prevent duplicate records at the data target level?
    5) What is a virtual cube? What is its significance?
    6) What are the different methods of creating a generic DataSource?
    7) How do you connect a new data target to an existing data flow?
    8) What is a partition?
    9) SAP batch process
    10) How do you improve InfoCube design performance?
    12) Is there any difference between a repair run and a repair request? If yes, please explain in detail.
    13) What is the difference between a process chain and an InfoPackage group?
    What is the difference between a partition and an aggregate?
    Answers:
    1)   Hi Santosh
    Please find some of the answers here...
    For Q 3) Query Performance can be improved by making the Aggregates having all the Chars & KF used in Query.
    Q 5) Virtual Cube : InfoProvider with transaction data that is not stored in the object itself, but which is read directly for analysis and reporting purposes. The relevant data can be from the BI system or from other SAP or non-SAP systems.
    VirtualProviders only allow read access to data.
    Q 6) Diff Methods of Generic datasource using Transaction RSO2 :
    a) Extraction from DB Table or View
    b) Extraction from SAP Query
    c) Extraction by Function Module
    2)  Important BW datasource relevant tables
    ROOSOURCE: Table Header for SAP BW OLTP Sources
    RODELTAM: BW Delta Process
    ROOSFIELD: DataSource Fields
    ROOSGEN: Generated Objects for OLTP Source, Last changed date and who etc.
    3)     For Q 8) I think you mean table partitioning.
    You use partitioning to improve performance. You can only partition on 0CALMONTH or 0FISCPER.
    4)     1. ROOSOURCE
    6. Generic extraction using 1. Views, 2. InfoSet queries, 3. Function modules
    5)     Hi Santosh,
    Please note down the Q&A below.
    Some real-time questions:
    Q) Under which menu path is the Test Workbench to be found, including in earlier Releases?
    The menu path is: Tools - ABAP Workbench - Test - Test Workbench.
    Q) I want to delete a BEx query that is in Production system through request. Is anyone aware about it?
    A) Have you tried the RSZDELETE transaction?
    Q) Errors while monitoring process chains.
    A) During data loading. Apart from them, in process chains you add so many process types, for example after loading data into Info Cube, you rollup data into aggregates, now this rolling up of data into aggregates is a process type which you keep after the process type for loading data into Cube. This rolling up into aggregates might fail.
    Another one is after you load data into ODS, you activate ODS data (another process type) this might also fail.
    Q) In Monitor → Details (Header/Status/Details) → Under Processing (data packet): Everything OK → Context menu of Data Package 1 (1 Records): Everything OK →
    Simulate update. (Here we can debug update rules or transfer rules.)
    SM50 → Program/Mode → Program → Debugging, and debug this work process.
    Q) PSA Cleansing.
    A) You know how to edit PSA. I don't think you can delete single records. You have to delete entire PSA data for a request.
    Q) Can we make a datasource to support delta.
    A) If this is a custom (user-defined) datasource you can make the datasource delta enabled. While creating datasource from RSO2, after entering datasource name and pressing create, in the next screen there is one button at the top, which says generic delta. If you want more details about this there is a chapter in Extraction book, it's in last pages u find out.
    Generic delta services: -
    Supports delta extraction for generic extractors according to:
    Time stamp
    Calendar day
    Numeric pointer, such as document number & counter
    Only one of these attributes can be set as a delta attribute.
    Delta extraction is supported for all generic extractors, such as tables/views, SAP Query and function modules
    The delta queue (RSA7) allows you to monitor the current status of the delta attribute
    Q) Workbooks, as a general rule, should be transported with the role.
    Here are a couple of scenarios:
    1. If both the workbook and its role have been previously transported, then the role does not need to be part of the transport.
    2. If the role exists in both dev and the target system but the workbook has never been transported, and then you have a choice of transporting the role (recommended) or just the workbook. If only the workbook is transported, then an additional step will have to be taken after import: Locate the WorkbookID via Table RSRWBINDEXT (in Dev and verify the same exists in the target system) and proceed to manually add it to the role in the target system via Transaction Code PFCG -- ALWAYS use control c/control v copy/paste for manually adding!
    3. If the role does not exist in the target system you should transport both the role and workbook. Keep in mind that a workbook is an object unto itself and has no dependencies on other objects. Thus, you do not receive an error message from the transport of 'just a workbook' -- even though it may not be visible, it will exist (verified via Table RSRWBINDEXT).
    Overall, as a general rule, you should transport roles with workbooks.
    Q) How much time does it take to extract 1 million (10 lakh) records into an InfoCube?
    A. This depends, if you have complex coding in update rules it will take longer time, or else it will take less than 30 minutes.
    Q) What are the five ASAP Methodologies?
    A: Project plan, Business Blue print, Realization, Final preparation & Go-Live - support.
    1. Project Preparation: In this phase, decision makers define clear project objectives and an efficient decision making process (i.e. Discussions with the client, like what are his needs and requirements etc.). Project managers will be involved in this phase (I guess).
    A Project Charter is issued and an implementation strategy is outlined in this phase.
    2. Business Blueprint: It is a detailed documentation of your company's requirements. (i.e. what are the objects we need to develop are modified depending on the client's requirements).
    3. Realization: In this only, the implementation of the project takes place (development of objects etc) and we are involved in the project from here only.
    4. Final Preparation: Final preparation before going live i.e. testing, conducting pre-go-live, end user training etc.
    End user training is given that is in the client site you train them how to work with the new environment, as they are new to the technology.
    5. Go-Live & support: The project has gone live and it is into production. The Project team will be supporting the end users.
    Q) What is landscape of R/3 & what is landscape of BW. Landscape of R/3 not sure.
    Then Landscape of b/w: u have the development system, testing system, production system
    Development system: All the implementation part is done in this sys. (I.e., Analysis of objects developing, modification etc) and from here the objects are transported to the testing system, but before transporting an initial test known as Unit testing (testing of objects) is done in the development sys.
    Testing/Quality system: quality check is done in this system and integration testing is done.
    Production system: All the extraction part takes place in this sys.
    Q) How do you measure the size of infocube?
    A: In no of records.
    Q). Difference between infocube and ODS?
    A: Infocube is structured as star schema (extended) where a fact table is surrounded by different dim table that are linked with DIM'ids. And the data wise, you will have aggregated data in the cubes. No overwrite functionality
    ODS is a flat structure (flat table) with no star schema concept and which will have granular data (detailed level). Overwrite functionality.
    Flat file datasources does not support 0recordmode in extraction.
    x before, -after, n new, a add, d delete, r reverse
    Q) Difference between display attributes and navigational attributes?
    A: Display attribute is one, which is used only for display purpose in the report. Where as navigational attribute is used for drilling down in the report. We don't need to maintain Navigational attribute in the cube as a characteristic (that is the advantage) to drill down.
    Q. SOME DATA IS UPLOADED TWICE INTO INFOCUBE. HOW TO CORRECT IT?
    A: But how is it possible? If you load it manually twice, then you can delete it by requestID.
    Q. CAN U ADD A NEW FIELD AT THE ODS LEVEL?
    Sure you can. ODS is nothing but a table.
    Q. CAN NUMBER OF DATASOURCES HAVE ONE INFOSOURCE?
    A) Yes of course. For example, for loading text and hierarchies we use different data sources but the same InfoSource.
    Q. BRIEF THE DATAFLOW IN BW.
    A) Data flows from transactional system to analytical system (BW). DataSources on the transactional system needs to be replicated on BW side and attached to infosource and update rules respectively.
    Q. CURRENCY CONVERSIONS CAN BE WRITTEN IN UPDATE RULES. WHY NOT IN TRANSFER RULES?
    Q) WHAT IS PROCEDURE TO UPDATE DATA INTO DATA TARGETS?
    FULL and DELTA.
    Q) AS WE USE Sbwnn, sbiw1, sbiw2 for delta update in LIS THEN WHAT IS THE PROCEDURE IN LO-COCKPIT?
    No LIS in LO cockpit. We will have datasources and can be maintained (append fields). Refer white paper on LO-Cockpit extractions.
    Q) Why we delete the setup tables (LBWG) & fill them (OLI*BW)?
    A) Initially we don't delete the setup tables, but when we change the extract structure we do. When we change the extract structure, there are some newly added fields that were not there before. So to get the required data (and to avoid redundancy) we delete and then refill the setup tables.
    To refresh the statistical data. The extraction set up reads the dataset that you want to process such as, customers orders with the tables like VBAK, VBAP) & fills the relevant communication structure with the data. The data is stored in cluster tables from where it is read when the initialization is run. It is important that during initialization phase, no one generates or modifies application data, at least until the tables can be set up.
    Q) SIGNIFICANCE of ODS?
    It holds granular data (detailed level).
    Q) WHERE THE PSA DATA IS STORED?
    In PSA table.
    Q) WHAT IS DATA SIZE?
    The volume of data one data target holds (in no. of records)
    Q) Different types of INFOCUBES.
    Basic, Virtual (remote, sap remote and multi)
    A Virtual Cube is used, for example, for something like railway reservations, where all the information has to be updated online. To design a Virtual Cube you have to write a function module that links to the table; the Virtual Cube is just a structure, and whenever the table is updated the Virtual Cube fetches the data from the table and displays the report online. FYI: go to https://www.sdn.sap.com/sdn/index.sdn and search for 'Designing Virtual Cube' and you will get good material on designing the function module.
    Q) INFOSET QUERY.
    Can be made of ODS's and Characteristic InfoObjects with masterdata.
    Q) IF THERE ARE 2 DATASOURCES HOW MANY TRANSFER STRUCTURES ARE THERE.
    In R/3 or in BW? 2 in R/3 and 2 in BW
    Q) ROUTINES?
    Exist in the InfoObject, transfer routines, update routines and start routine
    Q) BRIEF SOME STRUCTURES USED IN BEX.
    Rows and Columns, you can create structures.
    Q) WHAT ARE THE DIFFERENT VARIABLES USED IN BEX?
    Different Variable's are Texts, Formulas, Hierarchies, Hierarchy nodes & Characteristic values.
    Variable Types are
    Manual entry /default value
    Replacement path
    SAP exit
    Customer exit
    Authorization
    Q) HOW MANY LEVELS YOU CAN GO IN REPORTING?
    You can drill down to any level by using Navigational attributes and jump targets.
    Q) WHAT ARE INDEXES?
    Indexes are database indexes, which help in retrieving data quickly.
    Q) DIFFERENCE BETWEEN 2.1 AND 3.X VERSIONS.
    Help! Refer documentation
    Q) IS IT NESSESARY TO INITIALIZE EACH TIME THE DELTA UPDATE IS USED?
    No.
    Q) WHAT IS THE SIGNIFICANCE OF KPI'S?
    KPI's indicate the performance of a company. These are key figures
    Q) AFTER THE DATA EXTRACTION WHAT IS THE IMAGE POSITION.
    After image (correct me if I am wrong)
    Q) REPORTING AND RESTRICTIONS.
    Help! Refer documentation.
    Q) TOOLS USED FOR PERFORMANCE TUNING.
    ST22, Number ranges, delete indexes before load. Etc
    Q) PROCESS CHAINS: IF YOU HAVE USED THEM, HOW DO YOU SCHEDULE DATA LOADS DAILY?
    There should be some tool to run the job daily (SM37 jobs)
    Q) AUTHORIZATIONS.
    Profile generator
    Q) WEB REPORTING.
    What are you expecting??
    Q) CAN A CHARACTERISTIC INFOOBJECT BE AN INFOPROVIDER?
    Of course
    Q) PROCEDURES OF REPORTING ON MULTICUBES
    Refer help. What are you expecting? MultiCube works on Union condition
    Q) EXPLAIN TRANSPORTATION OF OBJECTS?
    Dev → Q and Dev → P
    Q) What types of partitioning are there for BW?
    There are two Partitioning Performance aspects for BW (Cube & PSA)
    Query Data Retrieval Performance Improvement:
    Partitioning by (say) Date Range improves data retrieval by making best use of database execution plans and indexes (of say Oracle database engine).
    B) Transactional Load Partitioning Improvement:
    Partitioning based on expected load volumes and data element sizes. Improves data loading into PSA and Cubes by infopackages (Eg. without timeouts).
    Q) How can I compare data in R/3 with data in a BW Cube after the daily delta loads? Are there any standard procedures for checking them or matching the number of records?
    A) You can go to R/3 TCode RSA3 and run the extractor. It will give you the number of records extracted. Then go to BW Monitor to check the number of records in the PSA and check to see if it is the same & also in the monitor header tab.
    A) RSA3 is a simple extractor checker program that allows you to rule out extracts problems in R/3. It is simple to use, but only really tells you if the extractor works. Since records that get updated into Cubes/ODS structures are controlled by Update Rules, you will not be able to determine what is in the Cube compared to what is in the R/3 environment. You will need to compare records on a 1:1 basis against records in R/3 transactions for the functional area in question. I would recommend enlisting the help of the end user community to assist since they presumably know the data.
    To use RSA3, go to it and enter the extractor ex: 2LIS_02_HDR. Click execute and you will see the record count, you can also go to display that data. You are not modifying anything so what you do in RSA3 has no effect on data quality afterwards. However, it will not tell you how many records should be expected in BW for a given load. You have that information in the monitor RSMO during and after data loads. From RSMO for a given load you can determine how many records were passed through the transfer rules from R/3, how many targets were updated, and how many records passed through the Update Rules. It also gives you error messages from the PSA.
    Q) Types of Transfer Rules?
    A) Field to Field mapping, Constant, Variable & routine.
    Q) Types of Update Rules?
    A) (Check box), Return table
    Q) Transfer Routine?
    A) Routines, which we write in, transfer rules.
    Q) Update Routine?
    A) Routines, which we write in Update rules
    Q) What is the difference between writing a routine in transfer rules and writing a routine in update rules?
    A) If you are using the same InfoSource to update data in more than one data target, it is better to write it in the transfer rules, because you can assign one InfoSource to more than one data target, whereas whatever logic you write in the update rules is specific to one particular data target.
    Q) Routine with Return Table.
    A) Update rules generally only have one return value. However, you can create a routine in the tab strip key figure calculation, by choosing checkbox Return table. The corresponding key figure routine then no longer has a return value, but a return table. You can then generate as many key figure values, as you like from one data record.
    Q) Start routines?
    A) Start routines can be written in both update rules and transfer rules. Suppose you want to restrict (delete) some records based on conditions before they get loaded into the data targets; you can specify this in the update rules start routine.
    Ex: DELETE DATA_PACKAGE, meaning it deletes records based on the condition.
    Q) X & Y Tables?
    X-table = A table to link material SIDs with SIDs for time-independent navigation attributes.
    Y-table = A table to link material SIDs with SIDS for time-dependent navigation attributes.
    There are four types of sid tables
    X time independent navigational attributes sid tables
    Y time dependent navigational attributes sid tables
    H hierarchy sid tables
    I hierarchy structure sid tables
    Q) Filters & Restricted Key figures (real time example)
    Restricted KF's u can have for an SD cube: billed quantity, billing value, no: of billing documents as RKF's.
    Q) Line-Item Dimension (give me an real time example)
    Line-Item Dimension: Invoice no: or Doc no: is a real time example
    Q) What does the number in the 'Total' column in Transaction RSA7 mean?
    A) The 'Total' column displays the number of LUWs that were written in the delta queue and that have not yet been confirmed. The number includes the LUWs of the last delta request (for repetition of a delta request) and the LUWs for the next delta request. A LUW only disappears from the RSA7 display when it has been transferred to the BW System and a new delta request has been received from the BW System.
    Q) How to know in which table (SAP BW) contains Technical Name / Description and creation data of a particular Reports. Reports that are created using BEx Analyzer.
    A) There is no such table in BW if you want to know such details while you are opening a particular query press properties button you will come to know all the details that you wanted.
    You will find your information about technical names and description about queries in the following tables. Directory of all reports (Table RSRREPDIR) and Directory of the reporting component elements (Table RSZELTDIR) for workbooks and the connections to queries check Where- used list for reports in workbooks (Table RSRWORKBOOK) Titles of Excel Workbooks in InfoCatalog (Table RSRWBINDEXT)
    Q) What is a LUW in the delta queue?
    A) A LUW from the point of view of the delta queue can be an individual document, a group of documents from a collective run or a whole data packet of an application extractor.
    Q) Why does the number in the 'Total' column in the overview screen of Transaction RSA7 differ from the number of data records that is displayed when you call the detail view?
    A) The number on the overview screen corresponds to the total of LUWs (see also first question) that were written to the qRFC queue and that have not yet been confirmed. The detail screen displays the records contained in the LUWs. Both, the records belonging to the previous delta request and the records that do not meet the selection conditions of the preceding delta init requests are filtered out. Thus, only the records that are ready for the next delta request are displayed on the detail screen. In the detail screen of Transaction RSA7, a possibly existing customer exit is not taken into account.
    Q) Why does Transaction RSA7 still display LUWs on the overview screen after successful delta loading?
    A) Only when a new delta has been requested does the source system learn that the previous delta was successfully loaded to the BW System. Then, the LUWs of the previous delta may be confirmed (and also deleted). In the meantime, the LUWs must be kept for a possible delta request repetition. In particular, the number on the overview screen does not change when the first delta was loaded to the BW System.
    Q) Why are selections not taken into account when the delta queue is filled?
    A) Filtering according to selections takes place when the system reads from the delta queue. This is necessary for reasons of performance.
    Q) Why is there a DataSource with '0' records in RSA7 if delta exists and has also been loaded successfully?
    It is most likely that this is a DataSource that does not send delta data to the BW System via the delta queue but directly via the extractor (delta for master data using ALE change pointers). Such a DataSource should not be displayed in RSA7. This error is corrected with BW 2.0B Support Package 11.
    Q) Do the entries in table ROIDOCPRMS have an impact on the performance of the loading procedure from the delta queue?
    A) The impact is limited. If performance problems are related to the loading process from the delta queue, then refer to the application-specific notes (for example in the CO-PA area, in the logistics cockpit area and so on).
    Caution: As of Plug In 2000.2 patch 3 the entries in table ROIDOCPRMS are as effective for the delta queue as for a full update. Please note, however, that LUWs are not split during data loading for consistency reasons. This means that when very large LUWs are written to the DeltaQueue, the actual package size may differ considerably from the MAXSIZE and MAXLINES parameters.
    Q) Why does it take so long to display the data in the delta queue (for example approximately 2 hours)?
    A) With Plug In 2001.1 the display was changed: the user has the option of defining the amount of data to be displayed, to restrict it, to selectively choose the number of a data record, to make a distinction between the 'actual' delta data and the data intended for repetition and so on.
    Q) What is the purpose of function 'Delete data and meta data in a queue' in RSA7? What exactly is deleted?
    A) You should act with extreme caution when you use the deletion function in the delta queue. It is comparable to deleting an InitDelta in the BW System and should preferably be executed there. You do not only delete all data of this DataSource for the affected BW System, but also lose the entire information concerning the delta initialization. Then you can only request new deltas after another delta initialization.
    When you delete the data, the LUWs kept in the qRFC queue for the corresponding target system are confirmed. Physical deletion only takes place in the qRFC outbound queue if there are no more references to the LUWs.
    The deletion function is for example intended for a case where the BW System, from which the delta initialization was originally executed, no longer exists or can no longer be accessed.
    Q) Why does it take so long to delete from the delta queue (for example half a day)?
    A) Import PlugIn 2000.2 patch 3. With this patch the performance during deletion is considerably improved.
    Q) Why is the delta queue not updated when you start the V3 update in the logistics cockpit area?
    A) It is most likely that a delta initialization had not yet run or that the delta initialization was not successful. A successful delta initialization (the corresponding request must have QM status 'green' in the BW System) is a prerequisite for the application data being written in the delta queue.
    Q) What is the relationship between RSA7 and the qRFC monitor (Transaction SMQ1)?
    A) The qRFC monitor basically displays the same data as RSA7. The internal queue name must be used for selection on the initial screen of the qRFC monitor. This is made up of the prefix 'BW, the client and the short name of the DataSource. For DataSources whose name are 19 characters long or shorter, the short name corresponds to the name of the DataSource. For DataSources whose name is longer than 19 characters (for delta-capable DataSources only possible as of PlugIn 2001.1) the short name is assigned in table ROOSSHORTN.
    In the qRFC monitor you cannot distinguish between repeatable and new LUWs. Moreover, the data of a LUW is displayed in an unstructured manner there.
    Q) Why are the data in the delta queue although the V3 update was not started?
    A) Data was posted in background. Then, the records are updated directly in the delta queue (RSA7). This happens in particular during automatic goods receipt posting (MRRS). There is no duplicate transfer of records to the BW system. See Note 417189.
    Q) Why does button 'Repeatable' on the RSA7 data details screen not only show data loaded into BW during the last delta but also data that were newly added, i.e. 'pure' delta records?
    A) Was programmed in a way that the request in repeat mode fetches both actually repeatable (old) data and new data from the source system.
    Q) I loaded several delta inits with various selections. For which one is the delta loaded?
    A) For delta, all selections made via delta inits are summed up. This means, a delta for the 'total' of all delta initializations is loaded.
    Q) How many selections for delta inits are possible in the system?
    A) With simple selections (intervals without complicated join conditions or single values), you can make up to about 100 delta inits. It should not be more.
    With complicated selection conditions, it should be only up to 10-20 delta inits.
    Reason: With many selection conditions that are joined in a complicated way, too many 'where' lines are generated in the generated ABAP source code that may exceed the memory limit.
    Q) I intend to copy the source system, i.e. make a client copy. What will happen with my delta? Should I initialize again after that?
    A) Before you copy a source client or source system, make sure that your deltas have been fetched from the DeltaQueue into BW and that no delta is pending. After the client copy, an inconsistency might occur between BW delta tables and the OLTP delta tables as described in Note 405943. After the client copy, Table ROOSPRMSC will probably be empty in the OLTP since this table is client-independent. After the system copy, the table will contain the entries with the old logical system name that are no longer useful for further delta loading from the new logical system. The delta must be initialized in any case since delta depends on both the BW system and the source system. Even if no dump 'MESSAGE_TYPE_X' occurs in BW when editing or creating an InfoPackage, you should expect that the delta have to be initialized after the copy.
    Q) Is it allowed in Transaction SMQ1 to use the functions for manual control of processes?
    A) Use SMQ1 as an instrument for diagnosis and control only. Make changes to BW queues only after informing the BW Support or only if this is explicitly requested in a note for component 'BC-BW' or 'BW-WHM-SAPI'.
    Q) Despite of the delta request being started after completion of the collective run (V3 update), it does not contain all documents. Only another delta request loads the missing documents into BW. What is the cause for this "splitting"?
    A) The collective run submits the open V2 documents for processing to the task handler, which processes them in one or several parallel update processes in an asynchronous way. For this reason, plan a sufficiently large "safety time window" between the end of the collective run in the source system and the start of the delta request in BW. An alternative solution where this problem does not occur is described in Note 505700.
Q) Why are LUWs still written into the DeltaQueue even though I deleted the delta init?
A) In general, delta initializations and deletions of delta inits should always be carried out at a time when no posting takes place. Otherwise, buffer problems may occur: if a user started an internal session at a time when the delta initialization was still active, he or she keeps posting data into the queue even though the initialization has since been deleted. This is the case in your system.
    Q) In SMQ1 (qRFC Monitor) I have status 'NOSEND'. In the table TRFCQOUT, some entries have the status 'READY', others 'RECORDED'. ARFCSSTATE is 'READ'. What do these statuses mean? Which values in the field 'Status' mean what and which values are correct and which are alarming? Are the statuses BW-specific or generally valid in qRFC?
    A) Table TRFCQOUT and ARFCSSTATE: Status READ means that the record was read once either in a delta request or in a repetition of the delta request. However, this does not mean that the record has successfully reached the BW yet. The status READY in the TRFCQOUT and RECORDED in the ARFCSSTATE means that the record has been written into the DeltaQueue and will be loaded into the BW with the next delta request or a repetition of a delta. In any case only the statuses READ, READY and RECORDED in both tables are considered to be valid. The status EXECUTED in TRFCQOUT can occur temporarily. It is set before starting a DeltaExtraction for all records with status READ present at that time. The records with status EXECUTED are usually deleted from the queue in packages within a delta request directly after setting the status before extracting a new delta. If you see such records, it means that either a process which is confirming and deleting records which have been loaded into the BW is successfully running at the moment, or, if the records remain in the table for a longer period of time with status EXECUTED, it is likely that there are problems with deleting the records which have already been successfully been loaded into the BW. In this state, no more deltas are loaded into the BW. Every other status is an indicator for an error or an inconsistency. NOSEND in SMQ1 means nothing (see note 378903).
    The value 'U' in field 'NOSEND' of table TRFCQOUT is discomforting.
    Q) The extract structure was changed when the DeltaQueue was empty. Afterwards new delta records were written to the DeltaQueue. When loading the delta into the PSA, it shows that some fields were moved. The same result occurs when the contents of the DeltaQueue are listed via the detail display. Why are the data displayed differently? What can be done?
A) Make sure that the change of the extract structure is also reflected in the database and that all servers are synchronized. We recommend resetting the buffers using transaction $SYNC. If the extract structure change is not communicated synchronously to the server where the delta records are created, the records are written with the old structure until the new structure has been generated. This may have disastrous consequences for the delta.
When the problem occurs, the delta needs to be re-initialized.
    Q) How and where can I control whether a repeat delta is requested?
    A) Via the status of the last delta in the BW Request Monitor. If the request is RED, the next load will be of type 'Repeat'. If you need to repeat the last load for certain reasons, set the request in the monitor to red manually. For the contents of the repeat see Question 14. Delta requests set to red despite of data being already updated lead to duplicate records in a subsequent repeat, if they have not been deleted from the data targets concerned before.
    Q) As of PI 2003.1, the Logistic Cockpit offers various types of update methods. Which update method is recommended in logistics? According to which criteria should the decision be made? How can I choose an update method in logistics?
A) See the recommendation in Note 505700.
    Q) Are there particular recommendations regarding the data volume the DeltaQueue may grow to without facing the danger of a read failure due to memory problems?
    A) There is no strict limit (except for the restricted number range of the 24-digit QCOUNT counter in the LUW management table - which is of no practical importance, however - or the restrictions regarding the volume and number of records in a database table).
When estimating "smooth" limits, both the number of LUWs and the average data volume per LUW are important. As a rule, we recommend bundling data (usually documents) already when writing to the DeltaQueue in order to keep the number of LUWs small (partly this can be set in the applications, e.g. in the Logistics Cockpit). The data volume of a single LUW should not be considerably larger than 10% of the memory available to the work process for data extraction (in a 32-bit architecture with a memory volume of about 1 GByte per work process, 100 MBytes per LUW should not be exceeded). That limit is of rather small practical importance anyway, since a comparable limit already applies when writing to the DeltaQueue. If the limit is observed, correct reading is guaranteed in most cases.
    If the number of LUWs cannot be reduced by bundling application transactions, you should at least make sure that the data are fetched from all connected BWs as quickly as possible. But for other, BW-specific, reasons, the frequency should not be higher than one DeltaRequest per hour.
    To avoid memory problems, a program-internal limit ensures that never more than 1 million LUWs are read and fetched from the database per DeltaRequest. If this limit is reached within a request, the DeltaQueue must be emptied by several successive DeltaRequests. We recommend, however, to try not to reach that limit but trigger the fetching of data from the connected BWs already when the number of LUWs reaches a 5-digit value.
    Q) I would like to display the date the data was uploaded on the report. Usually, we load the transactional data nightly. Is there any easy way to include this information on the report for users? So that they know the validity of the report.
    A) If I understand your requirement correctly, you want to display the date on which data was loaded into the data target from which the report is being executed. If it is so, configure your workbook to display the text elements in the report. This displays the relevance of data field, which is the date on which the data load has taken place.
Q) Can we filter the fields at the transfer structure?
A) Yes. When maintaining the transfer structure you select which DataSource fields are transferred, so unneeded fields can simply be left out.
Q) Can we load data directly into an InfoObject without extraction? Is that possible?
A) Yes. We can copy from another InfoObject if it is the same, and we can load data from the PSA if it is already there.
Q) How many days can we keep the data in the PSA if loads are scheduled daily, weekly, and monthly?
A) We can set the retention time.
Q) How can you get the data from the client if you are working on offshore projects? Through which network?
A) Through a VPN (Virtual Private Network). A VPN is a network through which we can connect to the client's systems from offshore via a RAS (remote access server).
Q) How do you analyze the project at first?
A) Prepare the project plan and environment
Define project management standards and procedures
Define implementation standards and procedures
Testing & go-live + support
Q) There is one ODS and 4 InfoCubes. We send data to all cubes at the same time; if one cube gets a lock error, how can you rectify the error?
A) Go to transaction SM66 and see which process holds the lock, note its PID, then go to transaction SM12 and release the lock. Such lock errors occur when the scheduled loads collide.
    Q) Can anybody tell me how to add a navigational attribute in the BEx report in the rows?
A) Expand the dimension in the left-hand panel (the InfoCube panel), select the navigational attribute, and drag and drop it into the Rows panel.
Q) Are there any transaction codes like SMPT or STMT?
A) In current systems (BW 3.0B and R/3 4.6B) these transaction codes don't exist!
Q) What is a transactional cube?
    A) Transactional InfoCubes differ from standard InfoCubes in that the former have an improved write access performance level. Standard InfoCubes are technically optimized for read-only access and for a comparatively small number of simultaneous accesses. Instead, the transactional InfoCube was developed to meet the demands of SAP Strategic Enterprise Management (SEM), meaning that, data is written to the InfoCube (possibly by several users at the same time) and re-read as soon as possible. Standard Basic cubes are not suitable for this.
    Q) Is there any way to delete cube contents within update rules from an ODS data source? The reason for this would be to delete (or zero out) a cube record in an "Open Order" cube if the open order quantity was 0.
    I've tried using the 0recordmode but that doesn't work. Also, would it
    be easier to write a program that would be run after the load and delete
    the records with a zero open qty?
A) You can write ABAP code in a start routine of the update rules.
A) Yes, you can do it. Create a start routine in the update rule.
Strictly speaking, this is not "deleting cube contents with update rules"; the start routine can only prevent certain records from being updated into the InfoCube. Loop over all the records and delete the ones that meet the condition ("the open order quantity was 0"). You also have to think about before and after images in the case of a delta upload: there you might delete the change record and keep the old information, which after the change is wrong. A minimal sketch of such a start routine follows.
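For illustration only, a minimal sketch of such a start routine in BW 3.x update rules; the field name OPEN_QTY is only an assumption standing in for the open order quantity field of your own communication structure:

* Start routine (BW 3.x update rules) - sketch only.
* DATA_PACKAGE is the internal table provided by the generated routine.
* OPEN_QTY is an assumed field name; use the quantity field of your
* own communication structure.
  DELETE DATA_PACKAGE WHERE open_qty = 0.

* Equivalent explicit loop, if additional checks are needed per record:
*  LOOP AT DATA_PACKAGE.
*    IF DATA_PACKAGE-open_qty = 0.
*      DELETE DATA_PACKAGE.       " drop the current record
*    ENDIF.
*  ENDLOOP.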
    Q) I am not able to access a node in hierarchy directly using variables for reports. When I am using Tcode RSZV it is giving a message that it doesn't exist in BW 3.0 and it is embedded in BEx. Can any one tell me the other options to get the same functionality in BEx?
A) Transaction RSZV is used only in versions earlier than 3.0B. From 3.0B onwards, this is possible in the Query Designer (BEx) itself: just right-click the InfoObject you want to use as a variable and proceed by selecting the variable type and processing type.
Q) If I run a report for the month range 01/2004 - 10/2004, how can I get the monthly value, i.e. the total divided by the number of months selected? Which variable should I use?
Q) Why is it that every time I switch from one InfoProvider or InfoObject to another while modeling, I get the message "Reading data" or "Constructing workbench" and it runs for minutes? Is there any way to stop this?
Q) Can anyone give me information on how the BW delta works? I would also like to know about 'before image and after image'; I am currently in a BW project and have to write start routines for delta loads.
Q) I am very new to BW. I would like to clarify a doubt regarding the delta extractor. If I understand correctly, with delta extractors the data that has already been loaded will not be uploaded again. Take a specific scenario, Sales: I have uploaded all the sales orders created until yesterday into the cube. Now say I make changes to one of the open records that was already uploaded. What happens when I schedule the load again? Will the same record be uploaded again with the changes, or will the changes be applied to the previous record?
Q) In BW we need to write ABAP routines. I wish to know when and what type of ABAP routines we have to write. Also, are these routines written in update rules? I would be glad if this could be clarified with real-time scenarios and a few examples.
A) We write our routines either in the start routine of the update rules or in the transfer structure (you can choose between writing them in the start routine or directly behind the individual characteristics). In the transfer structure you click the yellow triangle behind a characteristic and choose "Routine"; in the update rules you can choose "Start routine" or click the triangle with the green square behind an individual characteristic. Usually we only use a start routine when the logic does not concern one single characteristic (for example, when you have to read the same table for 4 characteristics). I hope this helps.
We used ABAP routines, for example (a minimal transfer-routine sketch follows the list):
To convert values to upper case (transfer structure)
To convert values from a third-party tool with different keys into the same keys our SAP system uses (transfer structure)
To select only part of the data from an InfoSource when updating the InfoCube (start routine), etc.
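As an illustration only, here is a minimal sketch of what the body of such an uppercase transfer routine (BW 3.x) might look like; TRAN_STRUCTURE-source_field is a placeholder, not a real field name, and stands for the actual field of your transfer structure:

* Transfer routine body (BW 3.x) - sketch only.
* TRAN_STRUCTURE-source_field is a placeholder for the real field name.
  RESULT = TRAN_STRUCTURE-source_field.
  TRANSLATE RESULT TO UPPER CASE.

* RETURNCODE = 0 keeps the record, ABORT = 0 continues the load.
  RETURNCODE = 0.
  ABORT      = 0.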
    Q) What is ODS?
    A) An ODS object acts as a storage location for consolidated and cleaned-up transaction data (transaction data or master data, for example) on the document (atomic) level.
    This data can be evaluated using a BEx query.
There are two types:
Standard ODS object
Transactional ODS object: the data is immediately available here for reporting.
    A transactional ODS object differs from a standard ODS object in the way it prepares data. In a standard ODS object, data is stored in different versions ((new) delta, active, (change log) modified), where as a transactional ODS object contains the data in a single version. Therefore, data is stored in precisely the same form in which it was written to the transactional ODS object by the application. In BW, you can use a transaction ODS object as a data target for an analysis process.
    The transactional ODS object is also required by diverse applications, such as SAP Strategic Enterprise Management (SEM) for example, as well as other external applications.
    Transactional ODS objects allow data to be available quickly. The data from this kind of ODS object is accessed transactionally, that is, data is written to the ODS object (possibly by several users at the same time) and reread as soon as possible.
It offers no replacement for the standard ODS object; rather, it is an additional object type.

• "Java and BAPI Technology for SAP" by Ken Kroes - Is this book useful?

    Howdy partners,
I've got a book called "Java and BAPI Technology for SAP" by Ken Kroes, Anil Thakur, Gareth M. deBruyn, and Robert Lyfareff.
Now, does anyone know if the info in this book is still relevant to the modern SAP world? I mean, it's still talking about ITS and things like that.
    Any input will be appreciated.

The one with the blue cover? I have one with a similar title, but it's quite out of date; there has been a lot of improvement on the Java side since then.
So I would prefer something newer from SAP-PRESS. Check the SAP-PRESS site.
BTW, I don't consider ITS an old technology; it was recently integrated into WAS 6.40 to make it more powerful. It is the only available technology that converts classic Dynpros to web pages dynamically.
    Peter

• What is ALE, EDI, IDoc tech for SAP

    Hi all,
What are ALE, EDI, and IDoc technologies for SAP?
    Thanks.

    hi deniz,
<b>ALE</b>
Triggering outbound IDocs via change pointers:
      When an application makes a change to an object, an entry is made in the CDHDR and CDPOS tables. The SMD (Shared Master Data) tool then checks the ALE settings and creates a change pointer in the BDCP table that points to the CDHDR entry. A standard program, RBDMIDOC, is scheduled to run periodically; it evaluates the change pointers for a message type and starts the ALE process to distribute the data to the appropriate destinations. The program reads table TBDME to determine the IDoc selection program, which is MASTERIDOC_CREATE_SMD_MATMAS in the case of MATMAS.
Outbound Process:
When the stand-alone program or the transaction (BD10) is run, it calls the IDoc selection program (MASTERIDOC_CREATE_REQ_MATCOR in the case of MATMAS), which is hard-coded in the program. The IDoc selection program reads the master data and creates a master IDoc, which it holds in memory. It then hands over to the ALE layer by calling the function module MASTER_IDOC_DISTRIBUTE. IDoc filtering, segment filtering, etc. are then done in the ALE layer.
     To dispatch the IDoc, the system reads the partner profile to determine the destination system. The sending system then calls the function module INBOUND_IDOC_PROCESS asynchronously on the remote system and passes the IDoc via a memory buffer.
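As an illustration of the hand-over to the ALE layer described above, here is a minimal ABAP sketch around MASTER_IDOC_DISTRIBUTE; the receiver partner RECEIVERSYS and the segment content are assumed values for illustration only, not taken from the post:

* Sketch only: build a master IDoc in memory and hand it to ALE.
DATA: ls_control TYPE edidc,                      " master IDoc control record
      lt_control TYPE STANDARD TABLE OF edidc,    " communication IDocs created
      lt_data    TYPE STANDARD TABLE OF edidd,    " data records
      ls_data    TYPE edidd.

ls_control-mestyp = 'MATMAS'.        " message type
ls_control-idoctp = 'MATMAS05'.      " basic IDoc type
ls_control-rcvprt = 'LS'.            " partner type: logical system
ls_control-rcvprn = 'RECEIVERSYS'.   " receiving logical system (assumption)

ls_data-segnam = 'E1MARAM'.          " segment with material master data
* ls_data-sdata = <flat segment structure moved into SDATA>
APPEND ls_data TO lt_data.

CALL FUNCTION 'MASTER_IDOC_DISTRIBUTE'
  EXPORTING
    master_idoc_control            = ls_control
  TABLES
    communication_idoc_control     = lt_control
    master_idoc_data               = lt_data
  EXCEPTIONS
    error_in_idoc_control          = 1
    error_writing_idoc_status      = 2
    error_in_idoc_data             = 3
    sending_logical_system_unknown = 4
    OTHERS                         = 5.

IF sy-subrc = 0.
  COMMIT WORK.   " the communication IDocs are only dispatched after the commit
ENDIF.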
Asynchronous Communication:
The data (the IDocs) and the function module call are stored in the ARFCDATA and ARFCSSTATE tables. The function module ARFC_DEST_SHIP transports the data to the target system, and ARFC_EXECUTE executes the stored function module. If a communication problem occurs, the program RSARFCSE is scheduled automatically; if the call is successful, the entries in ARFCSSTATE and ARFCDATA are deleted.
Inbound Process:
INBOUND_IDOC_PROCESS is triggered as a result of the RFC from the sending system; the IDoc to be processed is passed to it as a parameter.
If 'Process immediately' is selected in the partner profile, the program RBDAPP01 is executed. It reads the process code (MATM) from the partner profile, which in turn invokes the function module IDOC_INPUT_MATMAS for MATMAS.
     The function module calls the corresponding SAP transaction using CALL TRANSACTION, or uses direct input programs, to convert the IDoc into an application document.
Table TBD51 contains entries that show which option is used.
    Transactions:
    SALE - IMG ALE Configuration root
    •     WE20 - Manually maintain partner profiles
    •     BD64 - Maintain customer distribution model
    •     BD71 - Distribute customer distribution model
    •     SM59 - Create RFC Destinations
    •     BDM5 - Consistency check (Transaction scenarios)
    •     BD82 – Generate Partner Profiles
    •     BD61 – Activate Change Pointers - Globally
    •     BD50 – Activate Change Pointer for Msg Type
    •     BD52 – Activate change pointer per change.doc object
    •     BD59 – Allocation object type -> IDOC type
    •     BD56 – Maintain IDOC Segment Filters
    •     BD53 – Reduction of Message Types
    •     BD21 – Select Change Pointer
    •     BD87 – Status Monitor for ALE Messages
•     SALE – Display ALE Customizing
WEDI - ALE IDoc Administration
    •     WE60 - IDoc documentation
    •     SARA - IDoc archiving (Object type IDOC)
    •     WE47 - IDoc status maintenance
    •     WE07 - IDoc statistics
    BALE - ALE Distribution Administration
    •     WE05 - IDoc overview
    •     BD87 - Inbound IDoc reprocessing
    •     BD88 - Outbound IDoc reprocessing
    •     BDM2 - IDoc Trace
    •     BDM7 - IDoc Audit Analysis
    •     BD21 - Create IDocs from change pointers
    •     SM58 - Schedule RFC Failures
    <b>EDI</b>
EDI (Electronic Data Interchange)
    RFC DESTINATION:
                           A logical name used to identify the remote system on which a function needs to be executed.
    Partner Profile:
          It defines the characteristics of data being exchanged with the business partner.
Partner Function:
     The role of the partner, e.g. ship-to party, bill-to party, etc. in EDI; in ALE it is the legacy system or remote SAP system.
Partner type:
          The type of your business partner:
     EDI  – KU/LI
     ALE  – LS
    Archiving:
         -Need to archive the important transmitted document
    -Created in SARA
         -Assigned in WE20.
Partner Profile Tips:
•     RSECHK07 – consistency check for partner profiles.
•     If you want to modify a partner profile, copy it and then modify the copy; if you cannot modify it, you have to delete it.
•     To move a partner profile to the production system, choose Partner -> Transport.
Port definition:
     It defines the medium through which data is exchanged between systems.
•     tRFC    -   used by ALE
•     File    -   used by EDI
•     R/2 system port  – for R/2 systems
•     Internet port    -   to connect with internet applications
•     It also depends on the receiving side: if the receiving subsystem accepts tRFC, EDI can also use tRFC.
Remote function modules required by the subsystem to send status records and inbound IDocs to SAP:
startrfc                      -   program to execute RFC-enabled function modules in SAP
EDI_DATA_INCOMING             -   to send data as an inbound IDoc
EDI_STATUS_INCOMING           -   to send status records to SAP
RSPARAM                       -   to find out the gateway service of the SAP system
    Message Control:
    •     Output controller
    •     A service program for determining the output type
    •     To generate and manage various outputs from an application and control their medium and timing of exchange.
    •     To retransmit the document without duplicating it
    Procedure:
    •     A procedure defines a set of possible outputs for an application
    •     There may be more than one procedure but only one can be active
•     The requirement field in a procedure specifies the condition the sales document must fulfil before the sales order response is sent.
    Output type:
    Defines the characteristics and attributes of the output.
    Access Sequence :
Defines the sequence in which the business rules are checked, using the condition records of the condition tables, to propose an output type.
Sapfans Q & A
SREL_GET_NEXT_RELATIONS   -   function module that returns the number of the document that was created by an IDoc
RLSUB020                  -   a program that uses the above function module
Upgrade problem
When an old-version program is upgraded to a newer release (4.0B -> 4.6C), the SE38 editor will not let you work on the program in the usual way. To get rid of this:
Solution:
EDIT > MODIFICATION OPERATIONS > SWITCH OFF ASSISTANT. Hit Enter on the subsequently shown warnings window. You should now be able to maintain the code as before.
When transporting a dialog program, the links may sometimes be missing in the test and production systems. To avoid this, check that you have saved all includes and related objects in the same transport request; if not, do so. Another option in version 4.6C: within the ABAP editor, try Utilities -> Update navigation index; this fixes the links.
    ALE( Application Link Enabling )
Disk Mirroring:
Changes occurring in the database are reflected on another disk that maintains a mirror image of the main disk's contents.
    Replicas:
    Maintaining redundant data across multiple systems.
ALE:
  It provides a distributed environment that also integrates non-SAP systems.
  It provides guaranteed delivery of data to the application regardless of network problems.
Logical system:
     Each system involved in the distributed environment is assigned a logical name that uniquely identifies it in that environment.
    Data Mapping:
         Conversion of a business document in IDoc format to an EDI format (and vice versa) which is performed in the subsystem.
Message Types:
•     A message type represents a business function.
•     Its technical structure is the IDoc type.
•     One can say the message is an instance of the IDoc type.
<b>IDoc:</b>
     IDocs are containers used to exchange data between any two processes.
     An IDoc consists of an IDoc type and IDoc data: the IDoc type is the structure, and the IDoc data is an instance of it.
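For illustration only, a minimal sketch that reads the two halves of an IDoc from the database - the control record from EDIDC and the data records from EDID4; the IDoc number is supplied by the user, e.g. taken from WE05:

REPORT z_show_idoc.

* Sketch only: display the control record and data records of an IDoc.
PARAMETERS p_docnum TYPE edidc-docnum.

DATA: ls_control TYPE edidc,                       " control record
      lt_data    TYPE STANDARD TABLE OF edid4,     " data records
      ls_data    TYPE edid4.

SELECT SINGLE * FROM edidc INTO ls_control
  WHERE docnum = p_docnum.

SELECT * FROM edid4 INTO TABLE lt_data
  WHERE docnum = p_docnum
  ORDER BY segnum.

WRITE: / 'Message type:', ls_control-mestyp,
       / 'IDoc type:', ls_control-idoctp.

LOOP AT lt_data INTO ls_data.
  " segment name followed by the first characters of the flat data
  WRITE: / ls_data-segnam, ls_data-sdata(60).
ENDLOOP.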

  • Testing Tools for SAP & SAP BIW

    Hi Forum ,
Will anybody please let me know what testing tools are available for SAP and SAP BW? Which tool is popular and most used? Does anyone provide training for these tools?
    Regards
    Suhel

    Hi,
For SAP R/3 we use the CATT and eCATT tools.
Both are very useful tools, generally used for automated functional testing; they reduce cost and manual effort considerably.
You can even convert or call external tool scripts, such as Mercury QTP, from eCATT and run them from eCATT itself. You can also use these tools together with Solution Manager for effortless testing and test plan management.
    Check my weblogs.
Part 1: shows how to create a parameterized CATT script.
/people/community.user/blog/2007/02/05/create-and-migrate-catt-to-ecatt-part-1
Part 2: shows how to migrate a CATT script to eCATT.
/people/community.user/blog/2007/02/06/create-and-migrate-catt-to-ecatt-part-2
1. /people/community.user/blog/2007/01/02/integrating-ecatt-mercury-qtp-part-1
   This one shows the minimum configuration required for QTP and eCATT integration on the client and server side.
2. /people/community.user/blog/2007/01/15/integrating-ecatt-mercury-qtp-part-2
   This one shows the standalone mode, e.g. a Mercury script is created, uploaded to eCATT, and then executed from eCATT itself. It is now also possible to call this script from Solution Manager.
3. /people/community.user/blog/2007/01/22/integrating-ecatt-mercury-qtp-part-3
   This one shows the integration mode, which is the most powerful, as we can run QTP scripts from eCATT, upload and download them, modify existing ones, etc.

  • Reg Advice for SAP with RFID..

    Hello Friends,
I am totally new to the RFID concept. I now want to do a project in SAP with RFID, so can you give me some guidance on SAP with RFID and how to go further with it? Please also tell me some good links where I can find material on RFID and its use with SAP.
    Thanks in Advance.
    Marmik

    Hi marmik,
I am compiling some of my previous replies to
queries on similar lines; they will probably help you.
1. RFID - what is its purpose?
    Radio Frequency Identification (RFID) is a
    state-of-the-art technology for extensive
    identification of every type of object.
    It enables rapid and automatic data acquisition
    via radio waves.At its most basic level, Radio Frequency Identification ( RFID ) is a wireless
    link to uniquely identify objects or people.
    RFID is a means of capturing data about an object without using a human to read the data.
    The object of any RFID system is to carry data in suitable transponders, generally known as tags,
    and to retrieve data, by machine-readable means,
    at a suitable time and place to satisfy particular application needs. Data within a tag may provide identification for an item in manufacture,
    goods in transit, a location, the identity of a
    vehicle, an animal or individual.
Where it will mainly be used:
    RFID is becoming more popular in applications
    where other identification technologies, such as barcodes, are no longer sufficient
    (e.g. logistics, material management,
    industrial automation, services etc.).
    POSSIBLE COMMERCIAL APPLICATIONS OF THIS SYSTEM
    • Electronic Article Surveillance(EAS)
    • logistics
    • Asset Management
    • Waste Management
    • Yard Management
    • Fare Collection
    • Animal Identification
    • Luggage Tracking in Airports
    • Industrial laundries
    • Fugitive Emission Detection
    • Totes/Conveyor
    • Vehicle Identification
    • Automotive Anti-theft
    • Employee Identification
The list continues; it can be used in many more areas.
    Advantages of RFID:
    • Contactless identification
    • Reads through many materials like cardboard,
    wood etc.
    • Unlimited reading and writing of the memory
    • Identification takes less than a second
    • Simultaneous reading of various transponders
    • Resistant to harsh environmental conditions,
    such as extreme temperature, humidity etc.
    • Shape and size of the transponders can be
    adapted as desired
    • Transponders can be completely integrated
    into a product
    •High security due to copy and password
    protection or encrypted data communication
RFID Adoption (RFID Now) and Growth Prospects
    RFID DEVICES are beginning to replace
    magnetic-stripe security cards for unlocking
    doors and granting access to secured areas—
    especially at facilities with special security
    needs, such as military installations.
    The most visible use of RFID, though, is
    probably the automatic toll payment systems
    that rely on readers at toll plazas to scan
    tags attached to the windshields of passing
    cars. The reader records the tag’s ID and then
    deducts money from a prepaid account.
    These systems are designed to allow cars to
    zip through toll plazas ideally without stopping
    or even slowing down very much .Known as
    E- ZPass in New York, New Jersey, Delaware and
    other states, as FasTrak in California, and by
    different names elsewhere, RFID-based automatic
    toll systems have been operating for several years.
    RFID Growth Prospects(RFID FUTURE)
    Why the RFID market is set to expand rapidly
    1.Auto-ID Center at MIT produced standards: EPC,
    Air-protocol, Savant, ONS.
    2.Cost of tags is dropping rapidly
    3.The coming RFID tsunami is real and it has
    two mega-sized supply buyers written all over it:
    Wal-Mart and the DoD.
    Walmart, the worlds largest retailer, and the
    US Dept of Defense (DOD), are two powerful
    entities that have launched initiatives to
    implement RFID in their logistics operations,
    motivated by their needs to cut costs and
    streamline their supply chain in their
organizations. Wal-Mart has mandated that
its top 100 suppliers adopt RFID for pallets
    and cases by January 2005.
    4.In the coming years, Radio Frequency
    Identification (RFID) could have a major
    impact on any enterprise involved in the
    production, movement, or sale of physical
    goods. RFID's unique attributes will enable
    new applications and radical changes to business practices. RFID technology is a powerful
    Auto ID solution that can significantly
    increase efficiency and productivity.
    Implementing an RFID solution requires a
    thorough understanding of the capabilities and limitations of this emerging technology.
    So the future is absolutely very bright
    SAP RFID Solution Package
    The SAP RFID solution package provides a
    complete auto-ID middleware solution connecting
    RFID data directly from auto-ID data capture
    sources, such as RFID readers and device
    controllers, and integrates the data directly
    into enterprise applications . It converts
    raw RFID data into  business process
    information by making the associations from
    key business rules, master data, and transactions
    to raw RFID data .
    SAP-RFID
    SAP’s RFID solution is packaged as a standalone
    solution package built on top of SAP Web
    Application Server 6.40, and consists of:
    • SAP Auto-ID Infrastructure
    • SAP Event Management
    • SAP Enterprise Portal
    • Backend ERP adapters to SAP R/3 4.6C
    and greater
    SAP Auto-ID Infrastructure
    Auto-ID Infrastructure is compatible with
    SAP R/3 4.6C or greater, and it also has
    adapters for SAP Warehouse Management, SAP
    Business Information Warehouse, SAP Supply
    Chain Event Management, SAP Advanced Planning and Optimization, and all SAP NetWeaver components.
    Auto-ID Infrastructure is designed to synchronize
    EPC and RFID information into your ERP systems by providing a complete, multidimensional data model
    of RFID-enabled assets, including:
    1.EPC number
    2.Asset location
    3.Status
    4.Last RFID read point
    5.Handling unit number
    6.Packing hierarchy association
    (such as item, item-to-case, or case-to-pallet)
    It then maps this data to your SAP R/3 system
    via association to handling units (HUs). Handling
    unit headers include a standards-based HU number, weight, volume, and dimensions. HU content
    includes material number, quantity, unit of
    measure, batch number, plant number and location,
    and stock category (such as consignment), as well
    as serial number and inspection lot data.
    These HUs are then associated with EPC numbers
    and are integrated into the appropriate transaction
    for processes such as production orders, transfer orders, sales orders, delivery (ASNs) and receipt, shipment, picking and packing, and inventory move
    and packing operations.
    Because it is built on SAP Web
    Application Server,
    Auto-ID Infrastructure supports multiple
    communication interfaces (HTTP, SOAP, WSDL,
    SOCKET/ TCPIP), as well as mobile interfaces
    to PocketPC 2003 and Mobile Linux (Sharp).
    It runs on Windows (XP, 2000, NT), Unix, and
    Linux, and supports Oracle, SAP DB, DB2,
    and Microsoft SQL Server
Now, if you want some more technical
information on RFID, you can refer to the
sites mentioned below:
    1.     www.aimglobal.org
    2.     www.rfidjournal.com
    3.     www.rfid-handbook.com
    4.     www.sciam.com
    5.     www.tagss.com
    6.     www.dynasis.com
    7.     www.copytag.com
    8.     www.ean-int.org.
And if you want to know more about SAP-RFID,
you will find a lot of information on SDN,
as also rightly mentioned by ARUL.
1.     http://service.sap.com/scm
2.     http://help.sap.com/autoid20/helpdata/en/2d/4ac13f2ad10228e10000000a114084/frameset.htm - this is
the help for SAP AII.
    Regards,
    Pawan P. Khilari

  • Row Level Security not working for SAP R/3

    Hi Guys
    We have an environment where the details are as mentioned below:
    1. Crystal Reports are created using Open SQL driver to extract data from SAP R/3 using the SAP Integration Kit.
    2. The SAP roles are imported in Business Objects CMC.
3. Crystal Reports are published on the Enterprise as well.
4. Authorization objects are created in SAP R/3 and added as required for the row-level security, as mentioned in the SAP installation guide. The aim is that when the user logs into InfoView and refreshes the report, he should only see the data he is meant to see through the authorization objects. The data security works fine when the reports are designed directly on the table, but when the reports are built on the Business View it does not work, and the user is able to see all data.
    Any help in this issue is greatly appreciated.
    Thanks and Regards
    Kamal

    Hi,
    In order for row level security to work for you using the OpenSql driver, you need to configure the Security Definition Editor on your SAP server.  This is a server side tool which the Integration solution for SAP offers as a transport.
This tool defines which tables are to be restricted based on authorizations.
However, since you are seeing the issue on reports based on Business Views, you need to identify whether the Business View is configured in such a way that the user refreshing the report is the user logging into InfoView. If the connection to your SAP server is always established with the same user when the Business View is used, then your security definition is pointless.
    You can confirm this by tracing your SAP server to identify what user is being used to logon to SAP to refresh the reports.
    thanks
    Mike
