Regarding the migration from MI 2.5 to NetWeaver Mobile

Dear All,
We need to migrate an MI 2.5 installation to NetWeaver Mobile 7.1. I have read an SAP white paper on this, which describes a six-step approach. I have also read certain documents containing clauses such as: if a filter is placed on two fields in MI 2.5, that rule cannot be migrated automatically and needs to be created manually.
So I need to understand the following:
1. What are the limitations/clauses in the modelling regarding the migration to NetWeaver Mobile?
2. As NetWeaver Mobile has data objects, associations, dependencies, distribution models and distribution rules, what are the corresponding concepts in MI 2.5 for achieving dynamic data distribution and dependencies between the SyncBOs?
3. What do I need to study about the current MI 2.5 implementation in terms of data modelling and the mobile application?
4. What are the limitations of MI 2.5/7.0 compared to NetWeaver Mobile 7.1?
Can anyone please help me with this? It would be a great help.
Regards,
Kiran.

Thanks Ajith,
This is a really good piece of input. Did you get a chance to go through my Word document? A couple of clarifications I need are mentioned in it, along with the screenshots.
Clarifications are:
1. There are filters created for certain SyncBOs. They seem to be dynamic filters, so where in the current MI 2.5 system can I see where values like ME-SYNC_USER are configured? This is the field they have used for dynamic filtering of the data.
2. And there are no SyncBOs that have "Filtered by Referencing SyncBOs" enabled, with reference to the Excel sheet and screenshot in the Word document. Is this the way to look at related SyncBOs, and if so, does it mean there are no relations between the SyncBOs in the current MI 2.5 system?
3. Even though there are entries in table MEREP_408 for certain SyncBOs, I am unable to view the SyncBO exits using transaction MEREP_SBUILDER. It shows the exits for ZEFACTIVITY but not for the others. What does that mean? Does it mean exits are not maintained for the other SyncBOs?
4. Are the S01 and T01 SyncBOs converted to data objects in NetWeaver Mobile? If they are converted, can you please explain clearly what is meant by "not supported" and "unable to use the capabilities of NetWeaver Mobile 7.1"? I am unable to understand this.
5. Is the laptop client OS-dependent, or does the same client run on all operating systems if we install the respective JVM?
6. During synchronization, the standard client hits the standard sync service provided by SAP. If concurrency is high or the service is overloaded, what is the alternative? I am not talking about load balancing here. Can we do any routing to another server? Is that a network configuration, or is some configuration done at the DOE level?
I am really bombarding you with so many clarifications, but your help would mean a lot to me.
Regards,
Kiran.

Similar Messages

  • Accidentally removed a mailbox on the SBS 2008 server during the migration from SBS 2003

    I accidentally removed a legacy mailbox from the SBS 2008 server while still doing the migration from SBS 2003. I meant to click on Move Mailbox like the instructions said but clicked on the Remove option, then clicked the 'X' to close the box, since I figured that would cancel what I did; however, it removed the mailbox from the SBS 2008 list.
    Please tell me that I can get it back.  It still shows up in the old SBS Exchange System Manager but now when an email is sent, it sends back a no delivery message.  It's like the account doesn't exist any more.
    Can it be recovered and if so, how?
    thanks,
    Jeff

    Hi Jeff,
    Would you please let us know the current situation of this issue? Please check whether the above suggestions help you.
    If any update, please feel free to let us know.
    In addition, please also refer to the following article and thread, and check whether they help:
    How to use a Recovery Storage Group in SBS 2008
    How to recover a deleted user and mailbox on SBS 2008
    Hope this helps.
    Best regards,
    Justin Gu

  • Shared folders not visible after the migration from 10.1.3.2.1 to 10.1.3.4

    Hi All,
    After the migration from 10.1.3.2.1 to 10.1.3.4, we are not able to view the shared folders after clicking on the Answers link.
    Can you please let me know how I can resolve this?
    Thanks in Advance
    Siva.

    Hi,
    Please check the user privileges and Manage Permissions settings on the catalog folder.
    Thanks
    Deva

  • Got a new iMac; I had completed the migration from PC to the new Apple computer, but am still unable to see the files in iTunes

    I got a new iMac and completed the migration from my PC to the new iMac, but I am still unable to see all the apps and music that were saved from the old PC on the new iMac. Please help.

    My guess is the 30-day trial period has expired but you didn't remove the trial before installing the full version. The files to delete are the iWork ’09 folder from the main HD > Applications; the iWork ’09 folder in HD > Library > Application Support & the individual iWork application plist files found in HD > Users > (your account) > Library > Preferences for each user.
    Yvan Koenig has written an AppleScript that removes the files. You can find it on his iDisk in For_iWork > iWork '09 > uninstall iWork '09.zip.

  • Should I restart the migration from my iMac to my new MacBook Pro?

    Hi everyone,
    I've been a Mac user for many years now but I'm stuck with my current situation.
    I've just received my new MacBook Pro 13" 2.4Ghz and it's the first Mac I've owned for many years without Firewire. In the past transferring everything to a new Mac was simple but with this current migration I'm stuck.
    My MBP is connected to my iMac via my Wi-Fi network and everything seemed to be going to plan, albeit with a long estimated completion time (88 hours and 33 minutes for, I guess, approximately 200 GB of data). Now I'm over 24 hours into the migration and both computers are showing the same message ("Transferring documents for the user Elliot Pearson"), still with 88 hours remaining. The candy-stripe bar is still moving on both, but there's no way of seeing if anything is actually happening.
    What I want to know is whether it is worth it, or indeed dangerous, to abort the transfer and start again, potentially leaving incomplete or corrupted files. If the above seems normal and I just have to wait it out, then I don't want to have to start again from scratch!
    Is it possible to connect the two via Ethernet to do the transfer? I only remember being given the option for Wi-Fi, which will inevitably be slower.
    Any advice you can give me will be much appreciated!
    Elliot

    The MBP has a Firewire port. There's no need to migrate over a network nor would I recommend it.
    Use Target Disk Mode - Transferring files between two computers using FireWire. Boot the computer from which you are migrating into TDM. Boot the MBP normally after which you should find the hard drive icon for the other computer on the Desktop. Open Migration Assistant in the Utilities folder, select "Another drive" as the source then select the drive of the iMac.
    You can abort the transfer, which will likely leave the MBP in an unknown state for what has already been migrated. In that case I would simply locate the /Users/ folder and delete whatever is in there so you can start fresh with the migration.
    Of course you can leave everything alone until it's completed. I don't know how many GBs you are migrating, but using wireless is extremely slow - even at 802.11n speed it's 1/4 the speed of a USB drive connected directly.

  • Will the Apple Store do the migration from a MacBook Pro to a MacBook Air?

    I just purchased a refurbished MacBook Air from Apple today. I need to have all of the software, program files, emails, and the kitchen sink migrated to it from my MacBook Pro. Will my local Apple Store do the migration for me if I take them the receipt for my MacBook Air?

    I am certain they will, I am less certain they will do it for free.
    I seem to recall their charge is $50 but this may have changed.

  • Regarding database migration from Oracle to MS SQL

    Hi All,
    In my application, the existing database is Oracle. We need to migrate from the Oracle DB to MS SQL.
    The procedure we need is as follows:
    1. Export the data from Oracle and keep it in a file (.dmp).
    2. Import the data into MS SQL from the exported Oracle file,
    and vice versa from MS SQL to Oracle.
    My question is: is it possible to import a .dmp file (Oracle export data) into MS SQL?
    Please clarify this for me.

    There are a couple of options for getting data from Oracle to MS SQL.
    The simplest is to create a linked server in MS SQL that points to Oracle and copy the data via the linked server with a query like: select * into my_mssqlTable from oracle..schema.table;
    Some data types will not be supported.
    That is for one time copy.
    If you need a close to real time replication you can use MS replication (I would not recommend because it creates triggers on Oracle tables), or you can use GoldenGate or DataCurrents or other replication products.
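    As a minimal sketch of the linked-server approach described above (ORACLE_LNK, the TNS alias, and the schema/table names are placeholders, not from this thread):

    ```sql
    -- One-time setup on the MS SQL side: register a linked server that
    -- points at Oracle through the Oracle OLE DB provider.
    EXEC sp_addlinkedserver
        @server     = 'ORACLE_LNK',      -- placeholder linked-server name
        @srvproduct = 'Oracle',
        @provider   = 'OraOLEDB.Oracle',
        @datasrc    = 'MY_TNS_ALIAS';    -- placeholder TNS alias

    -- One-time copy: SELECT ... INTO creates the target table in MS SQL.
    -- Four-part name: linked_server..SCHEMA.TABLE (the catalog part is empty).
    SELECT *
    INTO   my_mssqlTable
    FROM   ORACLE_LNK..SCOTT.EMP;        -- placeholder schema and table
    ```

    Note also that a .dmp file produced by Oracle export is in a proprietary Oracle format and cannot be imported directly into MS SQL, which is why a linked server (or a replication product) is used instead.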

  • Migrated with Time Machine. Had all programs, photos, music, etc. Today all photos, music and some programs disappeared. Updated with the new software fix. Can I do the migration from Time Machine again?

    I migrated from old machine with Leopard to new machine with Lion.  After migration, I had all photos, music, applications, etc.  This morning there is no music, no photos and missing programs.  I have now downloaded the fix they put out yesterday.  Can I simply redo a new migration from Time Machine to get those things back?

    Monty,
    If you used Migration Assistant, what has probably occurred is that it created another user account. Log out of your current account and see if there is another account; if so, log into that one and see if your information is there.

  • HT4796 Does the migration from PC to Mac work? I have been running it for 20 hours and it looks to be hung up on still copying my music, and I don't have that much.

    Need help with the migration backup?

    Karsten,
    Most interesting reply and I certainly appreciate the sample illustration -- it's most helpful. However, I have this effect enabled and what happens is quite odd. Let me explain a little better as maybe I didn't word it correctly. Here's what occurs...
    1- Before the slide transitions to the next slide, it becomes fuzzy (not part of the transition itself which comes after)
    2- After the fuzziness the slide sort of notches upward a hairline before the transition. Thus, here it is: Normal Slide > Fuzziness > Notch > Transition > Normal Slide -- repeats.
    Side note: I noticed that on my daughter's DVD player, the slide would do the above, but would then go a step further: the imposed soundtrack would actually pause for a split second (in addition to the above) before the transition itself.
    I called Apple and was offered some Studio 4 Pro, and when I looked at the price I almost fainted ($999.99), which is quite steep for me and certainly more software than I need. That said, I'm looking into iMovie as an alternative (iDVD would have been great and fast on the workflow), but I need to come up with a system using my Mac, as I may have a wedding coming up within the next several months. Before I offer this service, I need to learn the Mac (which I love) and the software of choice. I went to a local book store and purchased this book:
    http://www.pricegrabber.com/search_getprod.php/isbn=9780470502129/
    hopefully it will help me.
    Sincerely,
    -Dave

  • Regarding the data load from the PSA to the InfoCube

    Hi Experts,
    I have a doubt: when I load data into the InfoCube, the data is not updated in the InfoCube, but the data is present in the PSA (about 4 million records). When I run the DTP from the PSA to the InfoCube, the data is not updated and the request shows a yellow status. In the PSA there are 20 packets in total, each containing about 50,000 records. Can you suggest the steps I have to take to load the data into an InfoCube from the PSA through a DTP? I am new to this field.
    Bye.

    Hi Vinay,
    Check this link...this solves all ur worries..
    http://help.sap.com/saphelp_nw2004s/helpdata/en/42/f98e07cc483255e10000000a1553f7/frameset.htm
    Also,
    Performance Tips for Data Transfer Processes  
    Request processing, the process of loading a data transfer process (DTP), can take place at various degrees of parallelization in the extraction and processing (transformation and update) steps. The system selects the most appropriate and efficient processing for the DTP in accordance with the settings in the DTP maintenance transaction, and derives a DTP processing mode.
    To further optimize the performance of request processing, there are a number of further measures that you can take:
    ●      By taking the appropriate measures, you can obtain a processing mode with a higher degree of parallelization.
    ●      A variety of measures can help to improve performance, in particular the settings in the DTP maintenance transaction. Some of these measures are source and data type specific.
    The following sections describe the various measures that can be taken.
    Higher Parallelization in the Request Processing Steps
    With a (standard) DTP, you can modify an existing system-defined processing mode by changing the settings for error handling and semantic grouping. The following combinations show how to optimize the performance of an existing DTP processing mode (original mode → optimized mode: measures required):
    ●      Serial extraction and processing of the source packages (P3) → serial extraction, immediate parallel processing (P2): select the grouping fields.
    ●      Serial extraction and processing of the source packages (P3) → parallel extraction and processing (P1): only possible with the persistent staging area (PSA) as the source; deactivate error handling.
    ●      Serial extraction, immediate parallel processing (P2) → parallel extraction and processing (P1): only possible with the PSA as the source; deactivate error handling and remove the grouping fields selection.
    Further Performance-Optimizing Measures
    Setting the number of parallel processes for a DTP during request processing.
    To optimize the performance of data transfer processes with parallel processing, you can set the number of permitted background processes for process type Set Data Transfer Process globally in BI Background Management.
    To further optimize performance for a given data transfer process, you can override the global setting:
    In the DTP maintenance transaction, choose Goto → Batch Manager Settings. Under Number of Processes, specify how many background processes should be used to process the DTP. Once you have made this setting, remember to save.
    Setting the Size of Data Packets
    In the standard setting in the data transfer process, the size of a data packet is set to 50,000 data records, on the assumption that a data record has a width of 1,000 bytes. To improve performance, you can increase the size of the data packet for smaller data records.
    Enter this value under Packet Size on the Extraction tab in the DTP maintenance transaction.
    Avoid too large DTP requests with a large number of source requests: Retrieve the data one request at a time
    A DTP request can be very large, since it bundles together all transfer-relevant requests from the source. To improve performance, you can stipulate that a DTP request always reads just one request at a time from the source.
    To make this setting, select Get All New Data in Source by Request on the Extraction tab in the DTP maintenance transaction. Once processing is completed, the DTP request checks for further new requests in the source. If it finds any, it automatically creates an additional DTP request.
    With DataSources as the source: Avoid too small data packets when using the DTP filter
    If you extract from a DataSource without error handling, and a large amount of data is excluded by the filter, this can cause the data packets loaded by the process to be very small. To improve performance, you can modify this behaviour by activating error handling and defining a grouping key.
    Select an error handling option on the Updating tab in the DTP maintenance function. Then define a suitable grouping key on the Extraction tab under Semantic Groups. This ensures that all data records belonging to a grouping key in a packet are extracted and processed.
    With DataStore objects as the source: Do not extract data before the first delta or during full extraction from the table of active data
    The change log grows in proportion to the table of active data, since it stores before- and after-images. To optimize performance during extraction in Full mode or with the first delta from the DataStore object, you can read the data from the table of active data instead of from the change log.
    To make this setting, select Active Table (with Archive) or Active Table (without Archive) under Extraction from… or Delta Extraction from… on the Extraction tab in the DTP maintenance function.
    With InfoCubes as the source: Use extraction from aggregates
    With InfoCube extraction, the data is read in the standard setting from the fact table (F table) and the table of compressed data (E table). To improve performance here, you can use aggregates for the extraction.
    Select Use Aggregates on the Extraction tab in the DTP maintenance transaction. The system then compares the output set of the transformation with the aggregates. If all InfoObjects of the output set are used in aggregates, the data is read from the aggregates during extraction instead of from the InfoCube tables.
    Note for using InfoProviders as the source
    If not all key fields of the source InfoProvider are assigned target fields in the transformation, the key figures of the source are aggregated over the unselected key fields during extraction. You can prevent this automatic aggregation by implementing a start routine or an intermediate InfoSource. Note, though, that this affects the performance of the data transfer process.
    Hope this helps you.
    VVenkat..

  • Regarding the upgrade from BW 3.5 to BI 7.0 reporting

    Hi Experts,
    Can anyone help me with how to upgrade the reporting tool from SAP BW 3.5 to SAP BI 7.0? We currently have SAP BW 3.5 and I want to upgrade it to SAP BI 7.0. I need this explained clearly, step by step.
    Bye.

    Hi Vijay,
    please check the following links:
    https://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/6a19f233-0e01-0010-6593-c47af5a8df3b
    https://www.sdn.sap.com/irj/sdn/event/webinars?rid=/library/uuid/90d58e76-e1b0-2a10-8b89-af8ceacf4b09
    https://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/472443f2-0c01-0010-20ab-fbd380d45881
    /message/2082536#2082536 [original link is broken]
    Regards
    Erwin
    Edited by: Erwin  Buda on Feb 9, 2009 11:35 AM

  • Regarding ClearCase migration from SunOS 5.8 to SunOS 10

    Hi All,
    We need to migrate our current ClearCase 2002.05.00 installation from SunOS 5.8 to SunOS 10 (UNIX).
    Can anyone suggest the best way to migrate?
    Thanks in advance.

    Since you are jumping several compiler releases and two Solaris releases, you should migrate in two steps.
    1. Get your application working with Sun Studio 11 on Solaris 8.
    2. You can now run that application on Solaris 10, or you can rebuild it on Solaris 10 and then run it.
    An application built on Solaris 8 will run on Solaris 9 and 10.
    An application built on Solaris 10 cannot be run on Solaris 8 or 9.
    We promise upward binary compatibility, allowing an old binary to be used with a newer compiler or on a newer Solaris. But we can't promise source code compatibility because we don't want to have to duplicate old bugs in old compilers.
    C++ 5.8 in Sun Studio 11 detects errors that older compilers might have missed. The rules about where to use "typename" are one example.
    I would not expect the warning about typename to be generated by either of the source code lines that you show, but I can't really tell without seeing a compilable example.
    You need to use typename when you use a type that depends on a template parameter. Example:
    template <class T> class C1 {
    public:
        typedef T Type;
    };
    template <class T> class C2 {
    public:
        typename T::Type a;
    };
    C2< C1<int> > c2i;
    If in C2 you do not specify typename, the compiler does not know whether T::Type is intended to be the name of a type. If it is not a type, the code is always invalid. The C++ rule is that if an identifier is not known to be a type, it is not a type. You use typename to tell the compiler that T::Type is supposed to be the name of a type.

  • How do I reverse the migration from my PC?

    Had I known that this would be such a disaster, I would never have done it. I was told today that I have to reinstall my OS, but an Apple doesn't come with a copy of the OS like a simple crappy Windows machine.

    You don't need to reinstall OS X. You just have to create a new user and delete the old one. As Migration Assistant doesn't transfer applications from Windows, it's much easier.
    To create a new user, open System Preferences > Users & Groups, press the + button at the bottom left corner and create a new administrator account. Then, go to  > Log Out, and log in the new user. Finally, open System Preferences > Users & Groups, choose your old user and press the - button at the bottom left corner.
    Regarding reinstalling OS X, just in case you don't know how it works: Macs come with OS X Recovery, which allows you to reinstall OS X and other things > http://support.apple.com/kb/HT4718

  • Error during the migration from SQL Server 2000 to Oracle 8.1.7

    /*[SPCONV-ERR(18)]:CREATE TABLE TT_CHANP1 statement passed to ddl file*/
    DELETE FROM TT_CHANP1;
    error as above.
    Note:
    chanp1 is a temporary table under SQL Server 2000.
    Syntax:
    create table #chanp1 (chanpid varchar(50))
    We tried this code:
    CREATE TEMP TABLE chanp1
    Error: PLS-00103: "CREATE"
    begin declare end exception
    exit for goto if loop mod null pragma raise return select update while
    <an identifier><a double-quoted delimited-identifier>
    <a bind variable><<close current delete fetch lock insert open
    roll back save point set sql execute commit for all
    <a single-quoted SQL string>
    Line: 63
    We then tried "CREATE GLOBAL TEMPORARY TABLE chanp1" again,
    but it also errors.

    It seems like you are trying to perform DDL from within PL/SQL code, which is not allowed directly.
    You could use EXECUTE IMMEDIATE within your pl/sql code to perform a DDL statement.
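    As a minimal sketch of that suggestion (the ON COMMIT clause below is an assumption; choose PRESERVE ROWS or DELETE ROWS to match how #chanp1 was used in SQL Server):

    ```sql
    -- DDL inside a PL/SQL block must go through dynamic SQL.
    -- Global temporary tables in Oracle are created once and reused;
    -- the rows (not the table) are private to each session.
    BEGIN
      EXECUTE IMMEDIATE
        'CREATE GLOBAL TEMPORARY TABLE chanp1 (chanpid VARCHAR2(50)) ' ||
        'ON COMMIT PRESERVE ROWS';
    END;
    /
    ```

    In practice, the CREATE GLOBAL TEMPORARY TABLE statement is usually run once as a standalone DDL script outside PL/SQL, and only the DELETE/INSERT statements remain in the procedure.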

  • Migrating from CUCM 8.6 to CUCM 10.0 - Do I have to upgrade presence as well?

    Hello Raees, 
    I've got a question regarding the migration from CUCM 8.6 to 10.0; I went through the "Ask the Expert" article below.
    https://supportforums.cisco.com/discussion/12103966/ask-expert-cisco-unified-communications-manager-cucm-version-100
    My question is: the CUCM 10.0 upgrade guide mentions the following on page 22:
    "Refresh upgrade of Unified Communications Manager nodes only
    Complete the high-level tasks listed in this section when you want to perform a refresh upgrade of only the
    Unified Communications Manager nodes in your network. When you upgrade Unified Communications
    Manager nodes without upgrading IM and Presence nodes, ensure that the installed version of the IM and
    Presence Service is compatible with the new version of the Unified Communications Manager software."
    I would only like to upgrade the CUCM nodes, but the client is using Presence 8.6 as well. Would I have to upgrade Presence 8.6 to something else, or can I just upgrade to CUCM 10.0 and Presence 8.6 would work with it? If I have to upgrade Presence as well, what version of Presence would I have to upgrade to? And for the refresh upgrade of Presence, can you mention the SKU that I need to get off the Product Upgrade Tool?
    Since it's going to be a refresh upgrade for CUCM from 8.6 to 10.0, can you also provide me with the SKU for the upgrade that I need to purchase? I can see two of them, as follows:
    UCM-10.X-SW-K9-XU=
    CUCM Software Version 10.X for PUT Only, Export Unrestricted
    UCM-10.X-SW-K9=
    CUCM Software Version 10.X for PUT Only
    Thanks
    Aamer

    Thank you for the reply Chris, 
    I have a few more questions with regard to what you mentioned in your reply; I'd be glad if you could clarify:
    I have not dealt with IMP before...
    1) Is it like a plugin that runs on CUCM instead of a full-fledged server, or is it a standalone server like CUPS (just renamed/rebranded)? To date I have been dealing with CUPS as a standalone server that integrates with CUCM via the CUCM -> System -> Application Server option.
    2) If it's a plug-in, how would the upgrade be done from the standalone CUPS to this IMP that runs within the CUCM server? Would it be a fresh installation or some activation of services within CUCM? And if it's a standalone server, what upgrade image do I need to upgrade CUPS 8.6.4 to IMP 10.0 or 10.5?
    I'm sorry, but the upgrade path from CUPS 8.6.4 to IMP 10.0/10.5 is not clear to me. Could you help me understand the migration from CUPS 8.6.4 to IMP 10.0/10.5, and could you also point me to any documentation that shows detailed steps for the upgrade? The CUCM upgrade is clear to me from the upgrade guide.
    Thank you
    Aamer
