IP mobility solution for two data centers

Hello guys,
Please give me some advice.
I am trying to work out a networking solution for two geographically separated data centers (DCs).
Both data centers will contain a virtualised server infrastructure (VMware).
The main design problem is choosing a good solution for IP mobility and for the DC interconnect. We need the ability to move VMs between the DCs, both for maintenance and for disaster recovery, and users must be able to reach the moved VMs seamlessly, without any special knowledge of routing. The users currently work in 1100 different offices across Ukraine.
Yes, I know a little about the Cisco OTV, RHI and LISP solutions, but they all require top-level switches and routers such as the Nexus 7000 and Catalyst 6500, which are very costly.
Is there a cheaper solution for IP mobility?
Could NHRP, the protocol that is part of DMVPN, be a good solution?
Are there any features in NHRP, or in another protocol, for monitoring the health of a VM and for customising and propagating the routing information when a VM moves?
Thank you for your help.

Similar Messages

  • Reg: Efficient solution for a data upload scenario

    Hi All,
    I have the following task: load data from a legacy system (which generates data only in the form of flat files) into SAP R3 as FB01 journals. The output file should be generated periodically (daily, weekly, fortnightly, etc.).
    Solution approaches:
    1) Write a BDC program to extract the data.
    2) Write an ABAP program to populate an IDoc (if a standard IDoc is available) or generate an outbound proxy (if a standard IDoc is not available) to push the data into SAP XI.
    Could anyone tell me which would be the best and most efficient approach for this task? I need your recommendations.
    Thanks in Advance.
    B.Lavanya
    Edited by: Lavanya Balanandham on Mar 31, 2008 2:23 PM

    Hi Lavanya,
    Loading data from the legacy system (flat files) into SAP R3 as FB01 journals - use BDC for this, because it handles large source files better.
    Generating the output file periodically (daily, weekly, fortnightly, etc.) - if this output file contains acknowledgments for the data uploaded by the above process, create an ABAP report for it and schedule it. But if the output contains other IDoc data that you need to send as a file to a third-party system, then go for SAP XI, provided the IDoc data is not too large. If the IDoc size is huge, just create an ABAP report that writes the data to a file on the application server and FTP the file to the third-party system.
    Regards,
    Rajeev Gupta

  • Solution for Overlapping dates

    Hi,
    I am looking for an easy solution to the problem described below.
    I have the following records under party key:
    PartyKey  AccountId  TaxType  DateStart   DateEnd
    123       145        100      01/01/2001  12/31/2002
    123       145        100      05/01/2001  05/31/2001
    123       145        100      01/01/2002  11/30/2002
    123       145        100      01/01/2003  12/31/2007
    124       111        300      01/01/2001  12/31/2006
    125       222        260      04/01/2001  11/30/2001
    125       222        260      01/01/2001  12/31/2001
    I want to select all records whose dates overlap with another record for the same party key (and the other columns), and also to list all records that have no overlapping dates.
    For example, for party key 123 the first three records have overlapping dates, and for party key 125 both records overlap.
    Party key 124 has no overlapping dates, and the dates of the last record for party key 123 do not overlap with the first three records.
    Can anybody help me build the code? It's very urgent, please.
    Thanks
    Rajesh

    My bad, I had to head out shortly so I rushed and introduced some copy/paste errors... This should be what you're after. Hopefully you'll take the time to understand it, since you'll have to support it :)
    select
        id_internal, cd_type_acct, cd_type_addr, id_acct, dt_effect, dt_end,
        case
          when (dt_effect between lead(dt_effect) over (partition by id_internal, cd_type_acct, cd_type_addr, id_acct order by dt_effect)
                              and lead(dt_end)    over (partition by id_internal, cd_type_acct, cd_type_addr, id_acct order by dt_effect))
            or (dt_end    between lead(dt_effect) over (partition by id_internal, cd_type_acct, cd_type_addr, id_acct order by dt_effect)
                              and lead(dt_end)    over (partition by id_internal, cd_type_acct, cd_type_addr, id_acct order by dt_effect))
            or (dt_effect between lag(dt_effect)  over (partition by id_internal, cd_type_acct, cd_type_addr, id_acct order by dt_effect)
                              and lag(dt_end)     over (partition by id_internal, cd_type_acct, cd_type_addr, id_acct order by dt_effect))
            or (dt_end    between lag(dt_effect)  over (partition by id_internal, cd_type_acct, cd_type_addr, id_acct order by dt_effect)
                              and lag(dt_end)     over (partition by id_internal, cd_type_acct, cd_type_addr, id_acct order by dt_effect))
          then 'Overlap'
          else 'No Overlap'
        end as overlaps
    from (
        select *
          from MV_ITS_PARTY_ADDRESS_EXCEP
         order by id_internal, cd_type_acct, cd_type_addr, id_acct, dt_effect
    );
           ID_INTERNAL       CD_TYPE_ACCT       CD_TYPE_ADDR ID_ACCT                              DT_EFFECT  DT_END     OVERLAPS
                 93535       300000042829                 10 300                                  10/01/2000 10/31/2000 Overlap
                 93535       300000042829                 10 300                                  10/01/2000 10/31/2000 Overlap
                 93535       300000042829                 10 300                                  11/01/2000 12/31/2001 No Overlap
           20001115051                  0                 60 0                                    01/01/2004 12/31/2004 No Overlap
           20001115051                  0                 60 0                                    01/01/2005 12/31/2005 No Overlap
           20001115051                  0                 60 0                                    01/01/2006 02/27/2006 Overlap
           20001115051                  0                 60 0                                    01/01/2006 02/27/2006 Overlap
           20001115051                  0                 60 0                                    01/01/2006 02/27/2006 Overlap
           20001115051                  0                 60 0                                    01/01/2006 02/27/2006 Overlap
           20001115051                  0                 60 0                                    01/01/2006 02/27/2006 Overlap
           20001115051                  0                 60 0                                    02/28/2006 12/31/9999 No Overlap
    11 rows selected.
    Elapsed: 00:00:00.18
    Message was edited by: Tubby
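    One thing worth noting about the lead/lag version above: it only compares each row with its immediate neighbours in dt_effect order, so an overlap that "skips over" an intervening row can go unflagged. For Rajesh's original party-key data, a plain self-join checks every pair of rows. A minimal sketch only, with assumed table and column names (party_dates, party_key, account_id, tax_type, date_start, date_end):
    select a.*,
           case when exists (
                  select 1
                    from party_dates b
                   where b.party_key  = a.party_key
                     and b.account_id = a.account_id
                     and b.tax_type   = a.tax_type
                     and b.rowid     <> a.rowid         -- skip the row itself
                     and b.date_start <= a.date_end     -- classic interval
                     and a.date_start <= b.date_end     -- overlap test
                )
                then 'Overlap'
                else 'No Overlap'
           end as overlaps
      from party_dates a;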

  • Is Workspace Manager the right solution for history data only

    I've been experimenting with WM to enable history on all tables in an application we have with about 100 tables.
    Everything works fine, except that we have lots of FK constraints, which require versioning to be enabled in the correct order.
    We know that changes to the DB layout will occur as the app evolves with business requirements.
    My question is: is WM the right solution here? I don't see any way to "transport" historical data in the event of a migration from the old DB layout to a new layout (DB refactoring).
    /christian

    Hi Christian,
    When you execute enableversioning on the tables, you can specify the complete set of tables that you want to version, instead of having to maintain the correct table order.
    There shouldn't be a need to transport the data between tables. We support DDL changes on versioned tables by using the beginDDL/commitDDL procedures. If you do require the transport of data between databases, then note that we only support a full database import/export. The user guide covers both of these topics in greater detail.
    Regards,
    Ben
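    For reference, a minimal sketch of the calls Ben describes, with placeholder table and column names; the VIEW_WO_OVERWRITE history option is what keeps the full change history the question asks about:
    begin
      -- version a set of related tables in one call, so WM resolves the FK order;
      -- VIEW_WO_OVERWRITE keeps all historical rows instead of overwriting them
      dbms_wm.enableversioning('customers,orders,order_items', 'VIEW_WO_OVERWRITE');
    end;
    /
    -- later, for a layout change (DB refactoring) on a versioned table:
    begin
      dbms_wm.beginddl('orders');
    end;
    /
    -- the DDL is run against the generated skeleton table (ORDERS_LTS here):
    alter table orders_lts add (notes varchar2(200));   -- placeholder change
    begin
      dbms_wm.commitddl('orders');
    end;
    /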

  • Looking for a proper analytical query/solution for below data

    I have data as shown below:
    mob_id     status_code     status_text     media_date
    39585     600     Online     04-Aug-09
    54988     600     Online     05-Aug-09
    39585     600     Online     05-Aug-09
    39585     600     Online     06-Aug-09
    39585     600     Online     07-Aug-09
    39585     600     Online     08-Aug-09
    39585     600     Online     09-Aug-09
    39585     600     Online     10-Aug-09
    39585     600     Online     11-Aug-09
    39585     600     Online     12-Aug-09
    39585     600     Online     13-Aug-09
    39585     600     Online     14-Aug-09
    39585     600     Online     15-Aug-09
    39585     600     Online     16-Aug-09
    39585     700     Deinstall     17-Aug-09
    54988     600     Online     06-Aug-09
    54988     600     Online     07-Aug-09
    54988     600     Online     08-Aug-09
    54988     600     Online     09-Aug-09
    54988     600     Online     10-Aug-09
    54988     600     Online     11-Aug-09
    54988     600     Online     12-Aug-09
    54988     600     Online     13-Aug-09
    54988     600     Online     14-Aug-09
    54988     600     Online     15-Aug-09
    54988     600     Online     16-Aug-09
    39585     600     Online     20-Aug-09
    39585     600     Online     21-Aug-09
    39585     600     Online     22-Aug-09
    39585     600     Online     23-Aug-09
    39585     600     Online     24-Aug-09
    39585     600     Online     25-Aug-09
    39585     700     Deinstall     26-Aug-09
    39585     600     Online     27-Aug-09
    39585     600     Online     28-Aug-09
    39585     600     Online     29-Aug-09
    39585     600     Online     30-Aug-09
    39585     600     Online     31-Aug-09
    39585     600     Online     01-Sep-09
    39585     700     Deinstall     02-Sep-09
    54988     600     Online     17-Aug-09
    54988     600     Online     18-Aug-09
    54988     600     Online     19-Aug-09
    54988     600     Online     20-Aug-09
    54988     600     Online     21-Aug-09
    54988     600     Online     22-Aug-09
    54988     600     Online     23-Aug-09
    54988     600     Online     24-Aug-09
    54988     600     Online     25-Aug-09
    54988     700     Deinstall     26-Aug-09
    69875     600     Online     20-Aug-09
    69875     600     Online     21-Aug-09
    69875     600     Online     22-Aug-09
    69875     600     Online     23-Aug-09
    69875     600     Online     24-Aug-09
    69875     600     Online     25-Aug-09
    69875     600     Online     26-Aug-09
    Using the above data, I need to produce the result set below. Can anyone help with this?
    occurrnace_seq     mob_id     start_media_date     end_media_date     no_of_days
    1     39585     04-Aug-09     17-Aug-09     13
    2     39585     20-Aug-09     26-Aug-09     6
    3     39585     27-Aug-09     02-Sep-09     6
    1     54988     05-Aug-09     26-Aug-09     21
    1     69875     20-Aug-09     null     null
    Here start_media_date comes from rows with status_code=600 and end_media_date from rows with status_code=700.
    Note that the same mob_id can start multiple times.
    Can anyone help me produce this result using SQL or PL/SQL?
    Many thanks in advance.
    Thanks
    Guttiwas

    guttis wrote:
    Can I run this query on 70 million records? Does it raise any performance problems? If you have any idea, just throw out some possible suggestions to protect against such issues.
    Well, you can certainly run it on 70 million records. How long it will run depends on your hardware and your Oracle and OS settings. That said, there is a simpler solution:
    select  occurrenace_seq,
            mob_id,
            min(case grp when 'start-of-group' then media_date end) start_media_date,
            max(case grp when 'end-of-group' then media_date end) end_media_date,
            max(case grp when 'end-of-group' then media_date end) - min(case grp when 'start-of-group' then media_date end) no_of_days
      from  (
             select  t.*,
                     case
                       when status_text = 'Deinstall' then 'end-of-group'
                       when lag(status_text,1,'Deinstall') over(partition by mob_id order by media_date) = 'Deinstall' then 'start-of-group'
                     end grp,
                     sum(case status_text when 'Deinstall' then 1 else 0 end) over(partition by mob_id order by media_date) +
                     case lag(status_text,1,'Deinstall') over(partition by mob_id order by media_date) when 'Deinstall' then 1 else 0 end occurrenace_seq
                from  your_table t
              )
      where grp in ('start-of-group','end-of-group')
      group by mob_id,
               occurrenace_seq
      order by mob_id,
               occurrenace_seq
    /
    With your sample:
    with t as (
               select 39585 mob_id,600 status_code,'Online' status_text, to_date('04-Aug-09','dd-mon-yy') media_date from dual union all
               select 54988,600,'Online',to_date('05-Aug-09','dd-mon-yy') from dual union all
               select 39585,600,'Online',to_date('05-Aug-09','dd-mon-yy') from dual union all
               select 39585,600,'Online',to_date('06-Aug-09','dd-mon-yy') from dual union all
               select 39585,600,'Online',to_date('07-Aug-09','dd-mon-yy') from dual union all
               select 39585,600,'Online',to_date('08-Aug-09','dd-mon-yy') from dual union all
               select 39585,600,'Online',to_date('09-Aug-09','dd-mon-yy') from dual union all
               select 39585,600,'Online',to_date('10-Aug-09','dd-mon-yy') from dual union all
               select 39585,600,'Online',to_date('11-Aug-09','dd-mon-yy') from dual union all
               select 39585,600,'Online',to_date('12-Aug-09','dd-mon-yy') from dual union all
               select 39585,600,'Online',to_date('13-Aug-09','dd-mon-yy') from dual union all
               select 39585,600,'Online',to_date('14-Aug-09','dd-mon-yy') from dual union all
               select 39585,600,'Online',to_date('15-Aug-09','dd-mon-yy') from dual union all
               select 39585,600,'Online',to_date('16-Aug-09','dd-mon-yy') from dual union all
               select 39585,700,'Deinstall', to_date('17-Aug-09','dd-mon-yy') from dual union all
               select 54988,600,'Online',to_date('06-Aug-09','dd-mon-yy') from dual union all
               select 54988,600,'Online',to_date('07-Aug-09','dd-mon-yy') from dual union all
               select 54988,600,'Online',to_date('08-Aug-09','dd-mon-yy') from dual union all
               select 54988,600,'Online',to_date('09-Aug-09','dd-mon-yy') from dual union all
               select 54988,600,'Online',to_date('10-Aug-09','dd-mon-yy') from dual union all
               select 54988,600,'Online',to_date('11-Aug-09','dd-mon-yy') from dual union all
               select 54988,600,'Online',to_date('12-Aug-09','dd-mon-yy') from dual union all
               select 54988,600,'Online',to_date('13-Aug-09','dd-mon-yy') from dual union all
               select 54988,600,'Online',to_date('14-Aug-09','dd-mon-yy') from dual union all
               select 54988,600,'Online',to_date('15-Aug-09','dd-mon-yy') from dual union all
               select 54988,600,'Online',to_date('16-Aug-09','dd-mon-yy') from dual union all
               select 39585,600,'Online',to_date('20-Aug-09','dd-mon-yy') from dual union all
               select 39585,600,'Online',to_date('21-Aug-09','dd-mon-yy') from dual union all
               select 39585,600,'Online',to_date('22-Aug-09','dd-mon-yy') from dual union all
               select 39585,600,'Online',to_date('23-Aug-09','dd-mon-yy') from dual union all
               select 39585,600,'Online',to_date('24-Aug-09','dd-mon-yy') from dual union all
               select 39585,600,'Online',to_date('25-Aug-09','dd-mon-yy') from dual union all
               select 39585,700,'Deinstall', to_date('26-Aug-09','dd-mon-yy') from dual union all
               select 39585,600,'Online',to_date('27-Aug-09','dd-mon-yy') from dual union all
               select 39585,600,'Online',to_date('28-Aug-09','dd-mon-yy') from dual union all
               select 39585,600,'Online',to_date('29-Aug-09','dd-mon-yy') from dual union all
               select 39585,600,'Online',to_date('30-Aug-09','dd-mon-yy') from dual union all
               select 39585,600,'Online',to_date('31-Aug-09','dd-mon-yy') from dual union all
               select 39585,600,'Online',to_date('01-Sep-09','dd-mon-yy') from dual union all
               select 39585,700,'Deinstall', to_date('02-Sep-09','dd-mon-yy') from dual union all
               select 54988,600,'Online',to_date('17-Aug-09','dd-mon-yy') from dual union all
               select 54988,600,'Online',to_date('18-Aug-09','dd-mon-yy') from dual union all
               select 54988,600,'Online',to_date('19-Aug-09','dd-mon-yy') from dual union all
               select 54988,600,'Online',to_date('20-Aug-09','dd-mon-yy') from dual union all
               select 54988,600,'Online',to_date('21-Aug-09','dd-mon-yy') from dual union all
               select 54988,600,'Online',to_date('22-Aug-09','dd-mon-yy') from dual union all
               select 54988,600,'Online',to_date('23-Aug-09','dd-mon-yy') from dual union all
               select 54988,600,'Online',to_date('24-Aug-09','dd-mon-yy') from dual union all
               select 54988,600,'Online',to_date('25-Aug-09','dd-mon-yy') from dual union all
               select 54988,700,'Deinstall', to_date('26-Aug-09','dd-mon-yy') from dual union all
               select 69875,600,'Online',to_date('20-Aug-09','dd-mon-yy') from dual union all
               select 69875,600,'Online',to_date('21-Aug-09','dd-mon-yy') from dual union all
               select 69875,600,'Online',to_date('22-Aug-09','dd-mon-yy') from dual union all
               select 69875,600,'Online',to_date('23-Aug-09','dd-mon-yy') from dual union all
               select 69875,600,'Online',to_date('24-Aug-09','dd-mon-yy') from dual union all
               select 69875,600,'Online',to_date('25-Aug-09','dd-mon-yy') from dual union all
               select 69875,600,'Online',to_date('26-Aug-09','dd-mon-yy') from dual
              )
    select  occurrenace_seq,
            mob_id,
            min(case grp when 'start-of-group' then media_date end) start_media_date,
            max(case grp when 'end-of-group' then media_date end) end_media_date,
            max(case grp when 'end-of-group' then media_date end) - min(case grp when 'start-of-group' then media_date end) no_of_days
      from  (
             select  t.*,
                     case
                       when status_text = 'Deinstall' then 'end-of-group'
                       when lag(status_text,1,'Deinstall') over(partition by mob_id order by media_date) = 'Deinstall' then 'start-of-group'
                     end grp,
                     sum(case status_text when 'Deinstall' then 1 else 0 end) over(partition by mob_id order by media_date) +
                     case lag(status_text,1,'Deinstall') over(partition by mob_id order by media_date) when 'Deinstall' then 1 else 0 end occurrenace_seq
                from  t
              )
      where grp in ('start-of-group','end-of-group')
      group by mob_id,
               occurrenace_seq
      order by mob_id,
               occurrenace_seq
    /
    OCCURRENACE_SEQ     MOB_ID START_MED END_MEDIA NO_OF_DAYS
                  1      39585 04-AUG-09 17-AUG-09         13
                  2      39585 20-AUG-09 26-AUG-09          6
                  3      39585 27-AUG-09 02-SEP-09          6
                  1      54988 05-AUG-09 26-AUG-09         21
                  1      69875 20-AUG-09
    SY.
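    If you are on Oracle 12c or later, MATCH_RECOGNIZE can express the same start/end grouping more directly. A sketch only, reusing the t sample data from the WITH clause above:
    select mob_id, occurrence_seq, start_media_date, end_media_date, no_of_days
      from t
      match_recognize (
        partition by mob_id
        order by media_date
        measures
          match_number()                                  as occurrence_seq,
          first(onl.media_date)                           as start_media_date,
          last(deinst.media_date)                         as end_media_date,
          last(deinst.media_date) - first(onl.media_date) as no_of_days
        one row per match
        pattern (onl+ deinst?)   -- a run of Online rows, optionally closed by a Deinstall
        define
          onl    as status_text = 'Online',
          deinst as status_text = 'Deinstall'
      )
    order by mob_id, occurrence_seq;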

  • Solution for mass data import to VB02

    Hello,
    Do you have any procedure for mass data import into the VB02 exclusion list for the ZSD1 3rd list key
    (sales dept. / distribution channel / material / customer)?
    We don't want to do that manually - more than 600 records.
    Do you have any idea how to import it - any easier way?

    Hi,
    You can import the records with either LSMW or a BAPI, whichever you are more comfortable with.
    LSMW has 14 steps, as I hope you know.
    Thanks,
    Surya

  • XML Solutions for Large Data Sets

    Hi,
    I'm working with a large data set (9 million records comprising 36 gigabytes) and am exploring the use of XML with it.
    I've experimented with a JDBC app (taken straight from Steve Muench's excellent Oracle XML Applications) for writing to CLOBs, but achieve throughputs of much less than 40k/s (the minimum speed required to process the data in < 10 days).
    What kind of throughputs are possible loading XML records from CLOBs into multiple tables (using server-side Java apps)?
    Could anyone comment whether XML is a feasible possibility for this size data set?
    Regards,
    Mike

    Just would like to identify myself (I'm the submitter):
    Michael Driscoll <[email protected]>.
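    For what it's worth, if the CLOBs can be treated as XMLType, recent Oracle releases can shred the XML into relational rows entirely inside the server with XMLTABLE in a plain INSERT ... SELECT, which avoids row-by-row processing from JDBC. A hedged sketch only; the table and element names here (xml_stage, target_table, /records/record, id, name) are placeholders, not from this thread:
    insert into target_table (id, name)
    select x.id, x.name
      from xml_stage s,
           xmltable('/records/record'
                    passing xmltype(s.doc)
                    columns
                      id   number        path 'id',
                      name varchar2(100) path 'name') x;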

  • Need suggestion for ISE distributed deployment model in two different data centers along with public certificate for HTTPS

    Hi Experts,
    I am a bit confused about the ISE distributed deployment model.
    I have two data centers, one the primary (DC) and the other a disaster recovery site (DR). I have a requirement to implement guest access using CWA and to get a public certificate for HTTPS to avoid certificate errors on client devices.
    How do I deploy the ISE personas for HA across these two data centers?
    After reading the Cisco docs, I understand that we can have two PANs (primary in the DC and secondary in the DR), and likewise for MnT (monitoring works the same way as the PAN). However, I can have 5 PSNs running in the secondary, i.e. in the DR ISE, and I am confused about HA for the PSNs: if all the PSNs are in the secondary site, HA would not work if that site fails.
    Can anybody suggest the best deployment solution for this scenario?
    Another question, about the public certificate:
    Public certificate: the ISE domain must be, or be part of, a registered domain name on the Internet, so I need the domain name in use from the customer.
    Please correct me if my understanding of the certificates is wrong:
    since guests are outside users, we cannot use a certificate from an internal CA; we need to get the certificate from a public provider and install it on both ISE servers.
    Can anybody explain the procedure for obtaining a public HTTPS certificate from a provider, and how to install it on both ISE servers?

    Hi there. Let me try answering your questions:
    PSN HA: The PSNs are not configured as "primary" or "secondary" inside your ISE deployment. They are just PSN nodes as far as ISE is concerned. Instead, inside your NADs (In your case WLCs) you can specify which PSN is primary, which one is secondary, etc. You can accomplish this by:
    1. Defining all PSN nodes as AAA radius servers inside the WLC
    2. Then under the SSID > AAA Servers Tab, you can list the AAA servers in the order that you prefer. As a result, the WLC will always use the first server listed until that server fails/gets reloaded, etc. 
    3. As a result, you can have one WLC or SSID prefer PSN server A (located in primary DC) while a second WLC or SSID prefer PSN server B (located in backup DC)
    Last but not least, you could also place the PSNs behind a load balancer, so that traffic is distributed equally between multiple PSNs. However, the PSN nodes must be Layer 2 adjacent, which is probably not the case if they are located in two different data centers.
    Certificates: Yes, you would want to get a public certificate to serve the guest portal. Getting a public/well-known certificate ensures that most devices out there will trust the CA that signed your ISE certificate. For instance, VeriSign, GoDaddy and Entrust are some of the ones that would work just fine. On the other hand, if you use a certificate signed by your internal CA, things will be fine for your internal endpoints that trust your internal CA, but any outsiders (guests, contractors, etc.) who do not trust and do not know your internal CA will get a certificate error when being redirected to the ISE guest portal. This is in general only a "cosmetic" issue, and if the users click "continue" and add your CA as a trusted authority, the guest page will load and the session will work. However, most users out there would not feel safe proceeding, and you will most likely get a lot of calls to your helpdesk :)
    I hope this helps!
    Thank you for rating helpful posts!

  • SAP AutoID Infrastructure vs Sybase SUP Mobile based solutions for RFID???

    Hello All,
    I'm trying to come up to speed on the basics, as we have projects going that have RFID, barcode and mobile design implications. As mobile has evolved, we're looking at some opportunities given the number of consumer-level devices available. We know we won't be able to simply replace all existing (ruggedized, etc.) devices, but the opportunity is there to do some new things, or old things with new devices. At the same time we have our existing RFID infrastructure (not SAP AII). My question is: is there any overlap between an SAP AII infrastructure and a mobile infrastructure? For example, from what I gather, an AII infrastructure typically has RFID middleware for data management, device monitoring, etc., that is integrated with SAP AII. That sounds very similar to a mobile architecture, with a middleware piece taking care of device management, data integration and management, etc. It is kind of troubling to me that two pieces of heavy infrastructure are needed for these two scenarios. And let's talk about xMII as well, which we also have installed.
    So I'm just coming up the curve on all this. Is there any overlap here where the RFID architecture could be handled by a good mobile design? I suspect any overlap is conceptual and the rub will be the device support.
    Thanks,
    Doug

    Hi
    I have been following SAP Mobility as well. Here is what I have gathered.
    SAP Sybase SUP - This is recommended for heavyweight applications. There are additional components on the SUP layer, such as Sybase Control Center and Afaria, for monitoring and device management.
    Project Gateway - This is recommended for lightweight applications. It does not require any additional infrastructure, only software components to be added. I have not heard of any release date for this product. Conceptually it seems very good, and based on the demo presented at TechEd 2011, development was very simple as well.
    ITS Mobile - This is the integrated ITS in ECC. SAP Console has been replaced with this product. It is used mainly for RF transactions in Warehouse Management. No additional infrastructure; some minimal configuration and ABAP development.
    BI Mobile - This requires additional infrastructure and software components to be added, whereby BI can interface with the mobile devices.
    SAP/Sybase has also built solutions for Workflow using the SUP platform.

  • Bandwidth requirement for a Mobile solution

    Hi
    Is there any data that I can use to determine the bandwidth requirements for a mobile solution (in connected mode)? Let's assume that we have only one user using only one app/screen - mobile sales order. For each sales order, what will be the bandwidth used? The assumption is that the screen is also loaded and data is also transmitted.
    Thanks in advance

    Hello Robert,
    please have a look at the SDN Mobile Infrastructure "homepage" and select the implementation case studies from the e-learning iView. Maybe there you can get an impression of the possible implementation effort.
    Regards,
    Tim

  • TS3297 "iTunes was unable to load data class information from Sync Services." What is this solution for the new phone to sync up with my iTunes acct?

    "iTunes was unable to load data class information from Sync Services." What is this solution for this new phone to sync up with my iTunes acct?

    Hi tadhunt,
    Here is an article that directly addresses that error message:
    iTunes for Windows: "Unable to load data class" or "Unable to load provider data" sync services alert
    http://support.apple.com/kb/ts2690
    I hope this helps!
    - Ari

  • I have a production mobile Flex app that uses RemoteObject calls for all data access, and it's working well, except for a new remote call I just added that only fails when running with a release build.  The same call works fine when running on the device

    I have a production mobile Flex app that uses RemoteObject calls for all data access, and it's working well, except for a new remote call I just added that only fails when running with a release build. The same call works fine when running on the device (iPhone) using a debug build. When running with a release build, the result handler is never called (nor is the fault handler called). Viewing the BlazeDS logs in debug mode, the call is received and sent back with data. I've narrowed it down to what seems to be a data size issue.
    I have targeted one specific data call that returns in the String value a string length of 44kb, which fails in the release build (result or fault handler never called), but the result handler is called as expected in the debug build. When I do not populate the String value (in server-side Java code) on the object (just set it to an empty string), the result handler is then called, and the object is returned (release build).
    The custom object being returned in the call is a very simple object, with getters/setters for the simple types boolean, int, String, and one org.w3c.dom.Document type. This same object type is used on other RemoteObject calls (different data) and works fine (release and debug builds). I originally was returning a Document but, just to make sure this wasn't the problem, changed the value to be returned to a String, just to rule out XML/DOM issues in serialization.
    I don't understand 1) why the release build vs. debug build behavior is different for a RemoteObject call, and 2) why the calls work in a debug build when sending over a somewhat large (but not unreasonable) amount of data in a String object, but not in a release build.
    I haven't tried to find out exactly where the failure point in size is, but I'm not sure that's even relevant, since 44kb isn't an unreasonable size to expect.
    By turning on debug mode in BlazeDS, I can see the object and its attributes being serialized, and everything looks good there. The calls are received and processed appropriately in BlazeDS for both debug and release build testing.
    Anyone have an idea on other things to try to debug/resolve this?
    Platform testing is BlazeDS 4, Flashbuilder 4.7, Websphere 8 server, iPhone (iOS 7.1.2). Tried using multiple Flex SDK's 4.12 to the latest 4.13, with no change in behavior.
    Thanks!

    After a week's worth of debugging, I found the issue.
    The Java type returned from the call was defined as ArrayList.  Changing it to List resolved the problem.
    I'm not sure why ArrayList isn't a valid return type; I've been looking at the Adobe docs and still can't see why it isn't valid. And why it works in debug mode and not in a release build is even stranger. Maybe someone can shed some light on the logic here for me.

  • Need solution for the following sync error: "iTunes could not sync calendars to the iPad "iPad name" because an error occurred while sending data from the iPad"

    Need solution for the following sync error: "iTunes could not sync calendars to the iPad "iPad name" because an error occurred while sending data from the iPad"

    I want to add that I deleted all the old backups and created a new backup without any issues, except for the sync problem.

  • EIS - Is this the solution for getting to Alpha data?

    Is there any other way to get to alpha data (stored in FISERV) through Essbase?

    As Jeff mentioned, unfortunately, the info provided is incorrect. Most external hard drives do not have the ability to act as a host device. Meaning, when you plug your camera into a computer, it generally mounts the camera as a drive (or provides similar functionality). You then use your computer to pull the information from the camera. If you simply connect an external drive directly to the camera (as this solution suggests), there is no way to get the data from the camera to the drive.
    You can generally daisy chain firewire connections... meaning you can connect your camera to a firewire drive and, if the firewire drive has multiple firewire ports, you can then connect the second port to a computer with firewire. This is not an option for you, as you already know there is no firewire on the MacBook.
    There is some misconception that for external drives with multiple interfaces (firewire and USB and/or eSATA) you can connect a camera to the drive's firewire interface and then connect the drive to your computer using the USB connection. Unfortunately, you can't daisy chain across different interfaces; the drive can only support data throughput through one interface at a time.
    Still an external drive is good to have for backups, etc... but it is not a solution for your problem.

  • A Guide for Testing SAP NW MI & its Mobile Solutions (MAM, MAU, MTT, ...)

    Hi Guys,
    In my experience, there is not enough methodology and tooling to test the SAP Mobile solutions landscape. There are scattered documents from SAP, which are neither enough nor sufficient to reliably carry out the different required tests, from unit tests through the stress and volume tests, before go-live!
    I would appreciate it very much if we combined all our experiences to compile a guide on this topic, which we can all share and even continue updating as SAP further develops its Mobile Technology.
    I would suggest that this SAP Mobility Solutions Test Guide should contain the main sections:
    - Testing NW MI MW server.
    - Testing the ECC as a MI backend system
    - Critical business processes of the mobile scenario (MAM for example)
    - Testing the Mobile Applications on clients devices (MAM, MAU, MTT, …)
    - Performance tuning tips for all the involved software components
    - Performance tuning tips for the all involved servers.
    The guide should contain methodologies, tools, reference to SAP documents, sample actual testing results, ...
    I would volunteer to compile your feedback into a final master document and publish the final guide on SDN for all of us.
    If somebody has already compiled such a guide, it would be great if you published it on SDN.
    Your feedback about this idea and the contents for this guide is much appreciated, and it will surely help all of us deliver better quality.
    Thanks
    Ahmed El Adl

    Hi Ahmed,
    It's a nice thread to start with. We shall do the same.
    We have been testing MI for the past 1.6 years.
    <<<<- Testing NW MI MW server.
    - Testing the ECC as a MI backend system
    - Critical business processes of the mobile scenario (MAM for example)
    - Testing the Mobile Applications on clients devices (MAM, MAU, MTT, …)
    - Performance tuning tips for all the involved software components
    - Performance tuning tips for the all involved servers.>>>>
    Testing the NW MI MW server - the entire backend-side application can be automated (covering many features).
    We have been executing a few standard test cases and enhancing them.
    So can you explain more clearly what sort of methodologies you are talking about?
    We generally analyze customer-raised messages and try to pump in more test cases to cover the features.
    Maybe this is the start of gathering more enhancement methodologies to improve MI.
    FYI: in the New York release the MI architecture is going to change a bit, so we can expect better performance (Web Dynpro on the client side).
    Cheers,
    Karthick
