Simple scenario

Can somebody explain a simple scenario (like 2 or 3 dimensions and 2 or 3 key figures), starting from the data load (from a flat file, so that I can create it in Excel) through creating the InfoCube and generating a report?
  Or is there any source where I can find such a scenario?
Thanks in advance.

Hi,
You can do this as follows.
Create an ODS with the following key field:
1. Document number
Data fields:
2. Cal day
3. Comp code
4. Sales org
5. Customer code
6. Customer district
7. Material
8. Quantity
9. Price
Then create a cube with the following dimensions:
Organization, with the following characteristics:
Comp code
Sales org
Customer, with the following characteristics:
Customer code
Customer district
Material, with the following characteristics:
Material
Here, you need not consider the document number. The quantity and price go into the fact table.
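For the flat file itself, a minimal sketch of what it could look like (purely illustrative column names and values, assuming a comma-separated file saved from Excel with one row per document):
DOC_NO,CALDAY,COMP_CODE,SALES_ORG,CUSTOMER,CUST_DISTRICT,MATERIAL,QUANTITY,PRICE
9000000001,20070115,1000,1000,C001,NORTH,MAT-01,10,250.00
9000000002,20070116,1000,1010,C002,SOUTH,MAT-02,5,125.50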
Please let me know if it worked
Sriram

Similar Messages

  • XI Simple Scenario..

    Hi everyone,
    I am new to XI, so please tell me a simple scenario to start with (if possible with step-by-step instructions).
    Also, please tell me how to move from simple scenarios to more complex ones and, if possible, give me the complex scenarios too.
    Regards,
    Snehal

    Snehal,
    Doing a file-to-file scenario is the best way to start with XI. These blogs can help you with that:
    /people/venkat.donela/blog/2005/03/02/introduction-to-simplefile-xi-filescenario-and-complete-walk-through-for-starterspart1 - File to File Part 1
    /people/venkat.donela/blog/2005/03/03/introduction-to-simple-file-xi-filescenario-and-complete-walk-through-for-starterspart2 - File to File Part 2
    Regards,
    Bhavesh

  • Performance In Simple Scenarios

    I have done some performance testing to see if asynchronous triggers performs any better than synchronous triggers in a simple audit scenario -- capturing record snapshots at insert, update and delete events to a separate database within the same instance of SQL Server.
    Synchronous triggers performed 50% better than asynchronous triggers; this was with conversation reuse and the receive queue activation turned off, so the poor performance was just in the act of forming and sending the message, not receiving and processing.  This was not necessarily surprising to me, and yet I have to wonder under what conditions would we see real performance benefits for audit scenarios.
    I am interested if anyone has done similar testing, and if they received similar or different results.  If anyone had conditions where asynchronous triggers pulled ahead for audit scenarios, I would really like to hear back from them.  I invite any comments or suggestions for better performance.
    The asynchronous trigger:
    Code Snippet
    ALTER TRIGGER TR_CUSTOMER_INSERT ON DBO.CUSTOMER
    FOR INSERT AS
    BEGIN
      DECLARE
        @CONVERSATION UNIQUEIDENTIFIER ,
        @MESSAGE XML ,
        @LOG_OPERATION CHAR(1) ,
        @LOG_USER VARCHAR(35) ,
        @LOG_DATE DATETIME;
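      -- Conversation reuse: pick up an existing dialog handle instead of opening a new conversation for every insert.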
      SELECT TOP(1)
        @CONVERSATION = CONVERSATION_HANDLE ,
        @LOG_OPERATION = 'I' ,
        @LOG_USER = USER_NAME() ,
        @LOG_DATE = GETDATE()
      FROM SYS.CONVERSATION_ENDPOINTS;
      SET @MESSAGE =
      ( SELECT
          CUST_ID = NEW.CUST_ID ,
          CUST_DESCR = NEW.CUST_DESCR ,
          CUST_ADDRESS = NEW.CUST_ADDRESS ,
          LOG_OPERATION = @LOG_OPERATION ,
          LOG_USER = @LOG_USER ,
          LOG_DATE = @LOG_DATE
        FROM INSERTED NEW
        FOR XML AUTO );
      SEND ON CONVERSATION @CONVERSATION
        MESSAGE TYPE CUSTOMER_LOG_MESSAGE ( @MESSAGE );
    END;
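    For reference, here is a minimal sketch of the Service Broker objects such a trigger presumes; only the message type name CUSTOMER_LOG_MESSAGE comes from the trigger above, while the contract, queue, service, and dialog names are assumptions for illustration only:
    Code Snippet
    -- Message type used by the SEND in the trigger.
    CREATE MESSAGE TYPE CUSTOMER_LOG_MESSAGE VALIDATION = WELL_FORMED_XML;
    -- Contract, queues, and services (placeholder names).
    CREATE CONTRACT CUSTOMER_LOG_CONTRACT (CUSTOMER_LOG_MESSAGE SENT BY INITIATOR);
    CREATE QUEUE CUSTOMER_LOG_INITIATOR_QUEUE;
    CREATE QUEUE CUSTOMER_LOG_TARGET_QUEUE;  -- activation left off, as in the test described above
    CREATE SERVICE CUSTOMER_LOG_INITIATOR_SERVICE ON QUEUE CUSTOMER_LOG_INITIATOR_QUEUE;
    CREATE SERVICE CUSTOMER_LOG_TARGET_SERVICE ON QUEUE CUSTOMER_LOG_TARGET_QUEUE (CUSTOMER_LOG_CONTRACT);
    -- One long-lived dialog, opened once, so the trigger can pick its handle up
    -- from sys.conversation_endpoints (the conversation reuse mentioned above).
    DECLARE @dialog UNIQUEIDENTIFIER;
    BEGIN DIALOG CONVERSATION @dialog
      FROM SERVICE CUSTOMER_LOG_INITIATOR_SERVICE
      TO SERVICE 'CUSTOMER_LOG_TARGET_SERVICE'
      ON CONTRACT CUSTOMER_LOG_CONTRACT
      WITH ENCRYPTION = OFF;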
    The synchronous trigger:
    Code Snippet
    ALTER TRIGGER TR_CUSTOMER_INSERT ON DBO.CUSTOMER
    FOR INSERT AS
    BEGIN
      DECLARE
        @LOG_OPERATION CHAR(1) ,
        @LOG_USER VARCHAR(15) ,
        @LOG_DATE DATETIME;
      SELECT
        @LOG_OPERATION = 'I' ,
        @LOG_USER = USER_NAME() ,
        @LOG_DATE = GETDATE();
      INSERT INTO SALES_LOG.DBO.CUSTOMER
      SELECT
        CUST_ID = NEW.CUST_ID ,
        CUST_DESCR = NEW.CUST_DESCR ,
        CUST_ADDRESS = NEW.CUST_ADDRESS ,
        LOG_OPERATION = @LOG_OPERATION ,
        LOG_USER = @LOG_USER ,
        LOG_DATE = @LOG_DATE
      FROM INSERTED NEW
    END;

    Synchronous audit has to do one database write (one insert). Asynchronous audit has to do at least an insert and an update (the SEND) plus a delete (the RECEIVE) and an insert (the audit itself), so that is 4 database writes. If the destination audit service is remote, then the sys.transmission_queue operations have to be added (one insert and one delete). So clearly there is no way asynchronous audit can be on par with synchronous audit; there are at least 3 more writes to complete. And that is neglecting all the reads (like looking up the conversation handle, etc.) and all the marshaling/unmarshaling of the message (usually some fairly expensive XML processing).
    Within one database the asynchronous pattern is appealing when the trigger processing is expensive (so that the extra cost of going async is negligible) and reducing the original call response time is important. It could also help if the audit operations create high contention and deferring the audit reduces this. Some more esoteric reasons are when asynchronous processing is desired for architectural reasons, like the possibility to add a workflow triggered by the original operation and the desire to change this workflow on the fly without impact/downtime (e.g. more consumers of the async message are added, the message is shredded/dispatched to more processing apps and triggers more messages downstream, etc.).
    If the audit is between different databases, even within the same instance, then the problem of availability arises (the audit table/database may be down for intervals, blocking the original operations/application).
    If the audit is remote (different SQL Server instances), then using Service Broker solves the most difficult problem (communication) in addition to asynchronicity and availability; in that case the synchronous pattern (e.g. using a linked server) is really a bad choice.

  • File content conversion simple scenario

    Hi friends,
    I am trying FCC for the first time. I have created a simple file scenario without using FCC, to which I had to provide input in XML format,
    like this:
    <?xml version="1.0" encoding="UTF-8"?>
    <ns0:mt_sender xmlns:ns0="http:/soni.com">
       <ponum>1</ponum>
       <poqty>2</poqty>
       <poamt>3</poamt>
    </ns0:mt_sender>
    Now I want to use FCC for this, in which I have to provide comma-separated data, e.g. 11,12,114.4.
    Please let me know how to do this. I have gone through some blogs on SDN but did not find a solution for this.
    Thanks.,
    Brij........

    Hi,
    If you want the output file in a comma-separated pattern for the incoming XML file, you can use the following FCC parameters in your receiver channel (an example of the resulting output is shown after the list):
    Structure: Recordset,Record
    Record.fieldSeparator: ,
    Record.endSeparator: 'nl'
    Recordset.fieldSeparator: 'nl'
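    With those settings, the sample payload shown above (ponum=1, poqty=2, poamt=3) would be written on the receiver side as one comma-separated line per record, roughly like this:
    1,2,3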
    Regards,
    Swetha.

  • UDDI - Simple Scenarios

    hi UDDI experts,
    For the SAP online documentation shown below, is there any tutorial, example, or how-to guide available? Does anyone have experience using the SAP J2EE Engine's UDDI registry?
    Thanks in advance
    Kiran
    snippet from online documentation
    ==============================
    Simple UDDI Scenarios
    ·        One company publishes a standard in UDDI and the others can find and use the same standard.
    For example, a tour office company may offer rooms from several hotels. This company publishes a Web Service Interface in UDDI and each hotel that wants to be represented by this company implements a Web service that extends this Web Service Interface. Later on, after an agreement between both companies, the tour company may use the WSDL of the Hotel Web service to ask for free rooms, book rooms, and so on.
    ·        A published standard Web Service Interface already exists and you want to see who supports it and to choose the one that suits you best.


  • A very simple scenario, but I can't find a solution

    Hi,
    I have an iPhone running iOS 4.1. It is synced with the iTunes library at home, which has all my music, video, podcasts, apps, etc. Occasionally when I'm at work I come across an audio or video clip that I would like to save on the iPhone to watch during the commute back. I am reasonably tech savvy, but I can't find a way to accomplish this task. Since the iPhone is synced with the library at home, it cannot be synced with the iTunes library at work without being wiped and reinitialized. I have found a partial solution for video files using the (now unavailable) VLC app, which allows transfer of FLV or MP4 files, although since hardware acceleration is not available to the VLC app, watching any video above 480p is not possible. The VLC app does not accept MP3 files, so I'm out of luck with those.
    In the past I used non-iOS iPods and had no problems syncing them with the library at home while still being able to manually add music/videos from the iTunes library at work.
    I'm surprised that the device/ecosystem that is so user friendly suddenly cannot be used for such a simple task. All of the content I am talking about is DRM free and in open formats like MP3.
    This limitation that only one iTunes library is allowed to add/manage content on an iOS device seems very user-unfriendly to me. I understand the need to protect content, but again, I am only talking about non-purchased, publicly available content like MP3 files of programs published by radio stations.
    I have a VPN connection between home and work, but iTunes does not allow syncing via anything but USB. I have even tried to find some sort of "Virtual USB" port that would allow me to connect iPhone at work and have computer at home see it as USB connected device, but could not find any solution for this on OS X.
    Am I missing something trivial or is it impossible to add an MP3 file to an iPhone from anywhere but one iTunes library?
    Thank you.

    I have finally been able to solve this problem, albeit through an unorthodox method.
    I have placed my iTunes library folder on a NAS I have at home and configured both the home Mac Pro and the MacBook Pro at work to use the same library. Since I have a VPN, the file path is the same for both machines, so they can both access the same library and sync the phone from it.
    Now when I come across something at work that I want to drop onto the iPhone for listening on my way back home, I add the file to the library (it travels over the VPN to my home NAS) and then sync (it travels back over the same VPN to my iPhone).
    Since there's about 20mbit bandwidth, the speed and response is acceptable.
    The problem is solved.
    I wonder, why Apple had to put this strange limitation of only one library being able to sync to the iPhone? Is the rationale behind this decision documented anywhere?

  • JAVA API AND ABAP API SIMPLE SCENARIO

    Hello MDM gurus
    I have never used any Java API or ABAP API to leverage and present MDM functionality in front-end systems like the portal, etc.
    Could you please give me everything required to play around with the Java and ABAP APIs.
    Points will be given to every valuable answer.
    Thanks

    Hi Nazeer,
    In order to use the Portal you need the Java APIs. To start with, refer to the MDM Java docs to get a basic idea of the various classes and methods used in developing a simple Java application, and access it using the Portal.
    http://help.sap.com/saphelp_mdm550/helpdata/en/47/9f23e5cf9e3c5ce10000000a421937/frameset.htm
    Sample code for duplicating a repository:
    public class TestDuplicateRepository {
        public static ConnectionPool simpleConnection;
        public static RepositoryIdentifier repIdentifier, repIdentifier1;
        public static String session;
        public static String connection = "MDMServer_Test";
        public static String repository1 = "Test_Repository";
        public static String repository2 = "Test_Duplicate";
        public static DBMSType dbmsType = DBMSType.MS_SQL;

        public static void main(String[] args) throws CommandException, ConnectionException {
            // Creating the connection.
            simpleConnection = ConnectionPoolFactory.getInstance(connection);
            // Identifying the source and target repositories.
            repIdentifier = new RepositoryIdentifier(repository1, connection, dbmsType);
            repIdentifier1 = new RepositoryIdentifier(repository2, connection, dbmsType);
            // Creating the server session.
            CreateServerSessionCommand createServerSessionCmd = new CreateServerSessionCommand(simpleConnection);
            createServerSessionCmd.execute();
            session = createServerSessionCmd.getSession();
            // Authenticating the server session.
            AuthenticateServerSessionCommand auth = new AuthenticateServerSessionCommand(simpleConnection);
            auth.setSession(session);
            auth.setUserName("Admin");
            auth.setUserPassword("Admin");
            auth.execute();
            session = auth.getSession();
            // Duplicate repository command.
            DuplicateRepositoryCommand duplRepCmd = new DuplicateRepositoryCommand(simpleConnection);
            duplRepCmd.setDBMSUserName("sa");
            duplRepCmd.setDBMSUserPassword("abc");
            duplRepCmd.setSession(session);
            duplRepCmd.setSourceRepositoryIdentifier(repIdentifier);
            duplRepCmd.setTargetRepositoryIdentifier(repIdentifier1);
            duplRepCmd.execute();
        }
    }
    Similarly, you can try getting the server version and archiving a repository, and then move on to adding and modifying records, etc.
    For ABAP APIs refer the below link
    http://help.sap.com/saphelp_mdm550/helpdata/en/44/93aa6e31381053e10000000a422035/frameset.htm
    Regards,
    Jitesh Talreja

  • Multi Mapping for a simple scenario

    Hi,
    I have a scenario where I'm getting some 10 fields of data from the source, like below:
    Data:                      0--- Unbounded
        Company_Code   1-1
        Order_No            1-1
        Material              1-1
        Amount              1-1
    But my requirement is that on the receiver side I have two structures:
    1)Receiver1
    Data:                      0--- Unbounded
        Company_Code   1-1
        Order_No            1-1
    2)Receiver2
    Data:                      0--- Unbounded
        Company_Code   1-1
        Material              1-1
        Amount              1-1
    If the Company Code is 1000, then the data should go to the first receiver, and if the Company Code is 2000, then the data should go to the second receiver.
    This is my requirement.
    For this I did everything correctly in the IR using multi mapping; even when I test the mapping, it works fine.
    But in the ID (Integration Directory) I'm not able to see any interface mappings in the Enhanced Interface Determination. It displays a "No Objects Found" message.
    What can I do about this? Any suggestions, please?
    regards
    Jain

    Hi Jain,
    I think you are confused about the occurrences.
    For your requirement you created a structure with 0..unbounded in the source and the receivers.
    There is no need to change the occurrences of the message type or the interface structure.
    Just do it like below.
    1. Mapping:
       Select your source structure.
       Select your receiver structures.
       Put the condition for creating the nodes of the receiver structures.
       There is no need to change occurrences.
    2. Interface Mapping:
       Select your source message interface.
       Select your receiver interfaces.
       Have (1 mapping) 1 source, 2 receivers.
       There is no need to change occurrences.
    Now go to Interface Determination and choose Extended; you will get your mapping.
    Regards,
    Prakasu

  • XI and BAPI - simple scenario

    Hi,
    I'm just a beginner with XI. I've got the following scenario to develop. Could anyone help me?
    1. After completing production in the external system I'm receiving a LOIPRO.LOIPRO01 message from the R/3 system with status I0045 (I mean LOIPRO with status I0045 is being sent from the external system to XI).
    2. In that case I should call BAPI_GOODSMVT_CREATE with quantity 0 and the delivery flag set to completed.
    3. To call that BAPI, the IDoc sent from the external system has insufficient data (only the status and the process order). Because of that, BAPI_PROCORD_GET_DETAIL must be called before BAPI_GOODSMVT_CREATE.
    Should I use BPM, or can I avoid using it?
    Could anyone describe how this should look in the Integration Repository? (I would be very grateful for any step-by-step description.)
    Thanks in advance.

    Hi Lukasz,
    While your scenario could be accomplished using a BPM, this would not be optimal. One reason is the performance overhead incurred and another is that the information received from BAPI_PROCORD_GET_DETAIL could (in principle) be stale by the time you later call BAPI_GOODSMVT_CREATE because there is no way to call the 2 BAPIs from XI in the same transactional context.
    A better solution would be to wrap the 2 BAPI calls in either a remote-enabled function module or, better yet, a receiver proxy in the R/3 system. Thus, XI simply calls this RFC or proxy and it, in turn, takes care of calling the 2 BAPIs in sequence.
    Regards,
    Thorsten

  • GET VPN in a simple scenario

    R1---Cloud(R4)----R2
              |
              R3(KS)
    hi,
    I set up 3 routers, with R3 being the KS, for a very simple GET VPN. It is not working. The underlying reachability is fine.
    any idea?
    thanks,
    Han
    =====R3, KS====
    crypto isakmp policy 10
    encr aes
    authentication pre-share
    group 2
    crypto isakmp key cisco address 1.1.14.1
    crypto isakmp key cisco address 1.1.24.2
    crypto ipsec transform-set mygdoi-trans esp-aes esp-sha-hmac
    crypto ipsec profile godi-profile-getvpn
    set security-association lifetime seconds 7200
    set transform-set mygdoi-trans
    crypto gdoi group getvpn
    identity number 1234
    server local
      rekey retransmit 10 number 2
      sa ipsec 1
       profile godi-profile-getvpn
       match address ipv4 199
       replay counter window-size 64
    interface Serial1/0
    ip address 1.1.34.3 255.255.255.0
    serial restart-delay 0
    router ospf 1
    log-adjacency-changes
    network 0.0.0.0 255.255.255.255 area 0
    ip forward-protocol nd
    no ip http server
    no ip http secure-server
    access-list 199 permit ip host 1.1.1.1 host 2.2.2.2
    access-list 199 permit ip host 2.2.2.2 host 1.1.1.1
    ============R1, GM============
    crypto isakmp policy 10
    encr aes
    authentication pre-share
    group 2
    lifetime 1200
    crypto isakmp key cisco address 1.1.34.3
    crypto gdoi group getvpn
    identity number 1234
    server address ipv4 1.1.34.3
    crypto map getvpn-map 10 gdoi
    set group getvpn
    interface Loopback0
    ip address 1.1.1.1 255.255.255.0
    interface FastEthernet0/0
    no ip address
    shutdown
    duplex half
    interface Serial1/0
    ip address 1.1.14.1 255.255.255.0
    serial restart-delay 0
    crypto map getvpn-map
    router ospf 1
    log-adjacency-changes
    network 0.0.0.0 255.255.255.255 area 0
    =====R2, GM=====
    crypto isakmp policy 10
    encr aes
    authentication pre-share
    group 2
    lifetime 1200
    crypto isakmp key cisco address 1.1.34.3
    crypto gdoi group getvpn
    identity number 1234
    server address ipv4 1.1.34.3
    crypto map getvpn-map 10 gdoi
    set group getvpn
    interface Loopback0
    ip address 2.2.2.2 255.255.255.0
    interface Serial1/0
    ip address 1.1.24.2 255.255.255.0
    serial restart-delay 0
    crypto map getvpn-map
    router ospf 1
    log-adjacency-changes
    network 0.0.0.0 255.255.255.255 area 0
    ============
    show crypto ipsec sa on R2
    R2#sh cry ips sa
    interface: Serial1/0
        Crypto map tag: getvpn-map, local addr 1.1.24.2
       protected vrf: (none)
       local  ident (addr/mask/prot/port): (2.0.0.0/255.0.0.0/0/0)
       remote ident (addr/mask/prot/port): (1.0.0.0/255.0.0.0/0/0)
       current_peer 0.0.0.0 port 848
         PERMIT, flags={origin_is_acl,}
        #pkts encaps: 0, #pkts encrypt: 0, #pkts digest: 0
        #pkts decaps: 0, #pkts decrypt: 0, #pkts verify: 0
        #pkts compressed: 0, #pkts decompressed: 0
        #pkts not compressed: 0, #pkts compr. failed: 0
        #pkts not decompressed: 0, #pkts decompress failed: 0
        #send errors 0, #recv errors 0
         local crypto endpt.: 1.1.24.2, remote crypto endpt.: 0.0.0.0
         path mtu 1500, ip mtu 1500, ip mtu idb Serial1/0
         current outbound spi: 0xB4D74B58(3034008408)
         PFS (Y/N): N, DH group: none
         inbound esp sas:
          spi: 0xB4D74B58(3034008408)
            transform: esp-aes esp-sha-hmac ,
            in use settings ={Tunnel, }
            conn id: 3, flow_id: SW:3, sibling_flags 80000040, crypto map: getvpn-map
            sa timing: remaining key lifetime (sec): (4739)
            Kilobyte Volume Rekey has been disabled
            IV size: 16 bytes
            replay detection support: N
            Status: ACTIVE
         inbound ah sas:
         inbound pcp sas:
         outbound esp sas:
          spi: 0xB4D74B58(3034008408)
            transform: esp-aes esp-sha-hmac ,
            in use settings ={Tunnel, }
            conn id: 4, flow_id: SW:4, sibling_flags 80000040, crypto map: getvpn-map
            sa timing: remaining key lifetime (sec): (4739)
            Kilobyte Volume Rekey has been disabled
            IV size: 16 bytes
            replay detection support: N
            Status: ACTIVE
         outbound ah sas:
         outbound pcp sas:
       protected vrf: (none)
       local  ident (addr/mask/prot/port): (1.0.0.0/255.0.0.0/0/0)
       remote ident (addr/mask/prot/port): (2.0.0.0/255.0.0.0/0/0)
       current_peer 0.0.0.0 port 848
         PERMIT, flags={origin_is_acl,}
        #pkts encaps: 0, #pkts encrypt: 0, #pkts digest: 0
        #pkts decaps: 0, #pkts decrypt: 0, #pkts verify: 0
        #pkts compressed: 0, #pkts decompressed: 0
        #pkts not compressed: 0, #pkts compr. failed: 0
        #pkts not decompressed: 0, #pkts decompress failed: 0
        #send errors 0, #recv errors 0
         local crypto endpt.: 1.1.24.2, remote crypto endpt.: 0.0.0.0
         path mtu 1500, ip mtu 1500, ip mtu idb Serial1/0
         current outbound spi: 0xB4D74B58(3034008408)
         PFS (Y/N): N, DH group: none
         inbound esp sas:
          spi: 0xB4D74B58(3034008408)
            transform: esp-aes esp-sha-hmac ,
            in use settings ={Tunnel, }
            conn id: 1, flow_id: SW:1, sibling_flags 80000040, crypto map: getvpn-map
            sa timing: remaining key lifetime (sec): (4739)
            Kilobyte Volume Rekey has been disabled
            IV size: 16 bytes
            replay detection support: N
            Status: ACTIVE
         inbound ah sas:
         inbound pcp sas:
         outbound esp sas:
          spi: 0xB4D74B58(3034008408)
            transform: esp-aes esp-sha-hmac ,
            in use settings ={Tunnel, }
            conn id: 2, flow_id: SW:2, sibling_flags 80000040, crypto map: getvpn-map
            sa timing: remaining key lifetime (sec): (4739)
            Kilobyte Volume Rekey has been disabled
            IV size: 16 bytes
            replay detection support: N
            Status: ACTIVE
         outbound ah sas:
         outbound pcp sas:
    R2#

    First, I would say the sorryserver should be the CSS2 vip and not a server behind it.
    This is a feasible solution.
    The only important point is that CSS1 needs to see the response from the server, so you need to NAT traffic on CSS1 with an IP address that is part of the CSS1 subnet, so that the server behind CSS2 sends the response to CSS1 and not directly to the client.
    You can do this with a group.
    ie:
    group natme
    vip x.x.x.x
    add destination service sorryserver1
    active
    Regards,
    Gilles.

  • Configuring simple scenario (File to File)

    I have done the first variant from the Simple Use Case.
    I got an error while activating the change list:
    Check result for Sender Agreement | ABP_FileSystem_XIPattern1 | XIPatternInterface1 | * | *:  Communication channel of sender " | ABP_FileSystem_XIPattern1 | XIChannel_FileSender " is of SAP adapter type "File". In this case, the receiver party must not be "*"
    My configuration is the same except I am using FTP Adapter...
    What did I do wrong?

    Hi,
    I think in your case it is possible to leave the receiver party empty.
    Try this.
    Regards,
    Robin

  • Conditional Calculations (IF-THEN-ELSE) - Simple Scenario

    I am referring to the link below:
    http://help.sap.com/saphelp_nw04/helpdata/en/23/17f13a2f160f28e10000000a114084/content.htm
    For test purposes I have created a simple condition the details are below
    ( ( 'RKF1' <> 'Quantity Total Stock' ) OR ( 'RKF2' <> 'Quantity Total Stock' ) * 1 * 2 )
    In my result set I would expect 1 to appear twice and 2 to appear once. However, I am only getting 1. Why is this? Thanks.

    Pradip,
    this relates to Murali's post above; Murali provided this expression.
    You wrote:
    "Case 1: Block 1 & Block 2 both are TRUE (value 1), then
    output = ((1) OR (1)*1*2) = 1*1*2 = 2
    Case 2: Both blocks are FALSE, then
    Output = ((0) OR (0)*1*2) = 0
    Case 3: Block 1 is FALSE and Block 2 is TRUE, then
    Output = ((0) OR (1)*1*2) = 1*1*2 = 2
    Case 4: Block 1 is TRUE and Block 2 is FALSE, then
    Output = ((1) OR (0)*1*2) = 1*1*2 = 2"
    Correct is:
    Case 1: Block 1 & Block 2 both are TRUE (value 1), then
    output = ((1) OR (1)*1*2) = ((1) OR (2)) = 1
    Case 2: Both blocks are FALSE, then
    Output = ((0) OR (0)*1*2) = 0
    Case 3: Block 1 is FALSE and Block 2 is TRUE, then
    Output = ((0) OR (1)*1*2) = ((0) OR (2)) = 1
    Case 4: Block 1 is TRUE and Block 2 is FALSE, then
    Output = ((1) OR (0)*1*2) = ((1) OR (0)) = 1
    You provided the correct reason yourself: the result is 1 if <Expression1> or <Expression2> does not equal 0; otherwise the result is 0.
    To achieve your results expression must be (note the brackets)
    ( ( 'RKF1' <> 'Quantity Total Stock' ) OR ( 'RKF2' <> 'Quantity Total Stock' ) ) *2
    =======================================================
    @Murali
    I think I do know why a smart use of addition enables you to model a logical OR without using the boolean expression. But it does not work vice versa - logical OR never becomes an addition.
    Best regards,
    Björn
    Message was edited by: Björn Grahlher

  • IDoc - File simple scenario using XI

    Good day everyone,
    As I remember, when using IDocs on an ABAP system it is possible to create a txt/xml file from the outbound IDoc in a specific library in the file system, using just the ABAP system/IE.
    My question is: when using XI, how can I achieve the same result?
    Is configuring the inbound communication channel as a file without conversion and setting the needed file encoding enough to create the target file?
    Is this possible the other way around?
    A file without conversion, without mapping, using an interface mapping with the same IDoc interface as source and target?
    That is, without message mapping at all - just an interface mapping from the IDoc structure to itself?
    thanks.

    Hi,
    Have you had a look at this blog by Stefan? It describes how to use files to collect multiple IDocs and then get them processed in XI:
    /people/stefan.grube/blog/2006/09/18/collecting-idocs-without-using-bpm
    If you want the target to be an XML file just like the IDoc itself, then no mapping or content conversion is needed.
    If the output file is in a different format, then you might need mapping and content conversion.
    Regards,
    Bhavesh
    Message was edited by: Bhavesh Kantilal

  • Best practice for simple scenario

    I'm working on a demo of file-to-file integration.
    The mapping between the two systems is complicated.
    For example:
    There is a field (CODE) in SYSTEM 1. I have to perform an SQL query (SELECT ID FROM TABLE WHERE CODE=CODE_FROM_SYSTEM_1). This ID will be the ID for SYSTEM 2.
    Which way is the best?
    1. Write the mapping logic in SYSTEM 2, as a service that performs the import.
    I will use a cache in order to reduce the number of SQL queries
    (for example, I perform the query SELECT ID,CODE FROM TABLE only once); each ID will then be obtained from the cache by CODE_FROM_SYSTEM_1.
    2. Create a more complicated business process that uses the JDBC adapter for executing the SQL queries.
    But what about caching?
    What do you think? Which way would be the best?
    Message was edited by: Sergey A

    Hi Sergey,
    If you can, use the first approach and leave only the message processing to XI.
    Also, if your files are big, not using BPM will be a good idea,
    unless your messages are very small - then you can try the second approach.
    Regards,
    michal

  • Need of Simple Scenarios...

    Dear friends,
    I am very new to ABAP, and I don't have any knowledge about business processes and process flows (for example, the purchase order flow).
    So I want to learn the basic ideas about the process flows. I searched for documents and sites that could explain them simply and clearly (the basic things that should be known by a good ABAPer), but I couldn't find any.
    So, friends, please guide me.
    Thanks in advance.

    SD,MM Flow
    Re: reg : document flow for SD and MM ?
    http://www.sap-img.com/sap-download/sap-tables.zip
    You can check the links..
    http://academic.uofs.edu/faculty/gramborw/sap/sapguide.htm
    http://www.sap-basis-abap.com/sapfiar.htm
    http://www.thespot4sap.com/IntroTo/SAP_FI_Module_Introduction.asp
    http://www.thespot4sap.com/IntroTo/SAP_CO_Module_Introduction.asp
    http://www.sap-basis-abap.com/sapco.htm
    http://help.sap.com/saphelp_40b/helpdata/en/e1/8e51341a06084de10000009b38f83b/applet.htm
    http://www.sapgenie.com/sapfunc/fi.htm
    http://www.sap-basis-abap.com/sapfi.htm
    http://help.sap.com/printdocu/core/Print46c/en/data/pdf/CAARCMM/CAARCMM.pdf
    http://help.sap.com/printdocu/core/Print46c/en/data/pdf/MYSAP/SR_MM.pdf
    http://help.sap.com/printdocu/core/Print46c/en/data/pdf/LOMDMM/LOMDMM.pdf
    http://help.sap.com/printdocu/core/Print46c/en/data/pdf/BCBMTWFMMM/BCBMTWFMMM.pdf
    http://help.sap.com/printdocu/core/Print46c/en/data/pdf/MMIVMVAL/MMIVMVAL.pdf
    Here are the links for MM:
    http://www.sapgenie.com/abap/tables_mm.htm
    http://www.sap-img.com/sap-download/sap-tables.zip
    http://www.allsaplinks.com/material_management.html
    http://www.training-classes.com/course_hierarchy/courses/2614_SAP_R_3_MM_Invoice_Verification_-_Rel_4_x.php
    http://www.sapfriends.com/sapstuff.html
    http://help.sap.com/printdocu/core/Print46c/en/data/pdf/PSMAT/PSMAT.pdf
    http://help.sap.com/printdocu/core/Print46c/en/data/pdf/CAARCMM/CAARCMM.pdf
    http://help.sap.com/printdocu/core/Print46c/en/data/pdf/MYSAP/SR_MM.pdf
    http://help.sap.com/printdocu/core/Print46c/en/data/pdf/LOMDMM/LOMDMM.pdf
    http://help.sap.com/printdocu/core/Print46c/en/data/pdf/BCBMTWFMMM/BCBMTWFMMM.pdf
    http://help.sap.com/printdocu/core/Print46c/en/data/pdf/MMIVMVAL/MMIVMVAL.pdf
    http://help.sap.com/printdocu/core/Print46c/en/data/pdf/MMWMLVS/MMWMLVS.pdf
    http://help.sap.com/printdocu/core/Print46c/en/data/pdf/MMISVE/MMISVE.pdf
    PO Flow
    http://sap-img.com/materials/what-is-the-dataflow-of-mm.htm
    SD Flow:
    http://www.sap-basis-abap.com/sd/sap-sd-processing-flow.htm
    http://www.sap-basis-abap.com/sapsd.htm
    MM Flow:
    http://www.erpgenie.com/sap/sapfunc/mm.htm
    FI :
    http://www.thespot4sap.com/IntroTo/SAP_FI_Module_Introduction.asp
    http://www.thespot4sap.com/IntroTo/SAP_CO_Module_Introduction.asp
    for tables check out link..
    http://www.sapgenie.com/abap/tables_fi.htm
    Take a look at this.
    http://www.sap-img.com/sap-fi.htm
    Please check this link and go to SAP R/3 Enterprise Application Components ->
    Cross-Application Components -> Financial.
    http://help.sap.com/saphelp_470/helpdata/en/e1/8e51341a06084de10000009b38f83b/frameset.htm
    you will go through this link
    http://www.sapgenie.com/sapfunc/fi.htm
    SD cycle:
    Enquiry -> Quotation -> Sales Order -> Delivery (Picking, Packing, Post Goods Issue and Shipment) -> Billing -> Data to FI
    MM cycle:
    Purchase Requisition -> Request for Quotation (RFQ) -> (Vendor Evaluation) -> Purchase Order (PO) -> Goods Receipt Note (GRN) -> Invoice Verification -> Data to FI
    FI:
    General Ledger
    Accounts receivable
    Accounts payable
    Special general ledger processing
    CO
    Profit center accounting
    Cost center Accounting.
    Detailed SD Flow is somewhat like this
    1. Inquiry Processing
    2.Quotation Processing
    3. Contract Processing
    4. Sales Order Processing
    5. Scheduling Agreement Processing
    6. Returns Processing
    7. Rebate Processing
    8. Sales Deal and Promotion Processing
    9. Display Customer and Material Information
    10. Billing Processing (online)
    11. Billing Processing (In The Background)
    12. Invoice List Processing
    13. Maintain Customer and Material Information
    14. Display Pricing
    15. Maintain Pricing
    16. Release Blocked Documents for Billing
    17. Release Sales Orders for Delivery
    18. Display Sales Information
    19. Display Billing Documents
    20. Sales Analysis
    21. Credit Management in Sales and Distribution Documents
    22. Backorder Processing
    23. Sales Support
    24. Output Processing
    MM flow:
    1) Purchase requisition
    2) Request for quotation
    3) Vendor selection
    4) Purchase order
    5) Goods receipt
    6) Invoice
    Regards
    Vasu
