Problem while Migrating user data from 10g to 11gR2

Hi experts,
I am trying to migrate user data (including passwords and security questions) from 10g to 11gR2. The approach I have followed is:
From 10g, using the API, I retrieved the user data, including passwords and security questions, and stored all of the information in a HashMap. This is one Java program.
Then I try to create those users in 11gR2 using the API, with the data retrieved from 10g. From this 11g program I create an object of the 10g client and use that HashMap to retrieve the user information. But I am not getting a connection to 10g; it throws an exception like "unknown application server". On both sides I used the API only, as it is recommended to use the API instead of a JDBC connection.
Please help me in this regard as soon as possible, and suggest if there is any other approach to migrate the user data.
Thanks in Advance

Using Trusted Recon, you won't be able to fetch the password as-is.
Since your goal is to fetch passwords too, please follow another approach.
You won't be able to get a connection to both 10g and 11g simultaneously in the same program.
So, break this task into two phases: first connect to 10g and fetch the user data into a CSV file, then connect to 11g and read this CSV to create the users.
Once the users are created properly, use the APIs to create the challenge questions and answers.
I think you are getting the "unknown application server" exception because you are trying to connect to both the 10g and 11g environments simultaneously.
Follow these steps:
(1) Using the 10g APIs you can't obtain the password of a user profile in decrypted form, so fetch the password using tcDataProvider. It will give you the plain-text password.
(2) In a custom scheduled task written for 10g, retrieve this data into a CSV file; after all, you can't hold this information in memory across both environments.
String query = "SELECT USR_LOGIN, USR_PASSWORD, USR_FIRST_NAME, USR_LAST_NAME FROM USR"; // add all fields which you want to retrieve from your 10g instance
(3) Using this query, tcDataProvider, tcDataSet and Java I/O (or any third-party CSV tool, such as the ones in csv.jar in the XL_HOME/ext folder), fetch this information into the CSV.
(4) Once the CSV is generated, the 10g machine is no longer needed. Connect to 11g using the 11g APIs. Write a custom 11g scheduled task that reads this CSV and uses the 11g APIs to create a user for each record.
(5) Once the user records are created in 11g, the difficult part is done. Transfer the security questions too by using the same CSV technique.
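To make steps (2), (3) and (4) concrete, here is a rough sketch of the 10g export side. It is only a sketch: it assumes the code runs inside a custom 10g scheduled task (where, as far as I recall, the scheduler framework's getDataBase() hands you the tcDataProvider); class and method names are from the 9.x/10g data-access API, and the output path and CSV handling are simplified for illustration.

import java.io.FileWriter;
import java.io.PrintWriter;

import com.thortech.xl.dataaccess.tcDataProvider;
import com.thortech.xl.dataaccess.tcDataSet;
import com.thortech.xl.scheduler.tasks.SchedulerBaseTask;

public class ExportUsersToCsvTask extends SchedulerBaseTask {

    public void execute() {
        // Add all fields which you want to migrate from your 10g instance.
        String query = "SELECT USR_LOGIN, USR_PASSWORD, USR_FIRST_NAME, USR_LAST_NAME FROM USR";
        try {
            tcDataProvider db = getDataBase(); // handed to the task by the scheduler framework
            tcDataSet ds = new tcDataSet();
            ds.setQuery(db, query);
            ds.executeQuery();

            PrintWriter out = new PrintWriter(new FileWriter("/tmp/oim10g_users.csv"));
            for (int i = 0; i < ds.getTotalRowCount(); i++) {
                ds.goToRow(i);
                // Real code should quote/escape values that may contain commas.
                out.println(ds.getString("USR_LOGIN") + ","
                        + ds.getString("USR_PASSWORD") + ","
                        + ds.getString("USR_FIRST_NAME") + ","
                        + ds.getString("USR_LAST_NAME"));
            }
            out.close();
        } catch (Exception e) {
            e.printStackTrace(); // use the task's own logging in real code
        }
    }
}

And for step (4), an equally rough sketch of the per-record creation on the 11g side. The attribute names and the organization key below are the commonly used ones but should be verified against your own 11g environment; the CSV layout matches the export above, and userManager is assumed to come from an authenticated OIMClient (or from Platform.getService(UserManager.class) inside a custom 11g scheduled task).

import java.io.BufferedReader;
import java.io.FileReader;
import java.util.HashMap;

import oracle.iam.identity.usermgmt.api.UserManager;
import oracle.iam.identity.usermgmt.vo.User;

public class ImportUsersFromCsv {

    public static void importAll(UserManager userManager, String csvPath) throws Exception {
        BufferedReader in = new BufferedReader(new FileReader(csvPath));
        String line;
        while ((line = in.readLine()) != null) {
            // Columns: USR_LOGIN, USR_PASSWORD, USR_FIRST_NAME, USR_LAST_NAME
            String[] f = line.split(",", -1);
            HashMap<String, Object> attrs = new HashMap<String, Object>();
            attrs.put("User Login", f[0]);
            attrs.put("usr_password", f[1]);
            attrs.put("First Name", f[2]);
            attrs.put("Last Name", f[3]);
            attrs.put("act_key", Long.valueOf(1)); // organization key -- point this at the right org
            attrs.put("Xellerate Type", "End-User");
            attrs.put("Role", "Full-Time");
            userManager.create(new User(f[0], attrs));
        }
        in.close();
    }
}

Once the users exist, the challenge questions and answers from the same CSV can be set with the 11g APIs in a similar loop, as per step (5).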
Please share results with us.

Similar Messages

  • Problem while exporting the data from a report to an excel file.

    Hi SAP guru's,
I have a problem while exporting the data from a report to an Excel file.
The problem is that after exporting, the Excel file seems to contain some irrelevant characters. I am checking this using the SOST transaction.
    Required text (Russian):
    Операции по счету                                    
    № документа     Тип документа     № учетной записи     Дата документа     Валюта     Сумма, вкл. НДС     Срок оплаты     Описание документа
    Current Text :
       ? 5 @ 0 F 8 8  ? >  A G 5 B C                                   
    !   4 > : C       "" 8 ?  4 > : C      !   C G 5 B = > 9  7 0 ? 8 A 8        0 B 0  4 > : C         0 ; N B 0      ! C <       ! @ > :  > ? ; 0 B K        ? 8 A 0 = 8 5  4 > : C
    Can you help me making configuration settings if any?
    Regards,
    Avinash Raju

    Hi Avinash
To download such characters you need to adjust the code page used during the export. You can review SAP Note 73606 to identify which code page is required for this language.
    Best regards

  • Facing a Problem while downloading the data from ALV Grid to Excel Sheet

    Hi Friends,
I am facing a problem while downloading the data from the ALV Grid to an Excel sheet. This works fine in the Development server, but when it comes to the Quality and Production servers I have this trouble.
I have nearly 11 fields in the ALV Grid, one of which is a PO number of length 10. All ten digits are visible in the Excel sheet if we download it from Development, but when we download it from Quality or Production it shows only 9 digits.
Can anyone help me out in this case?

Hi,
if this problem happens, don't display the same internal table you finally built.
Instead, create a new internal table that does not reference any standard data elements or domains, but has the same structure as your final internal table, and move all the values into the new internal table.
For example, your final internal table for display:
     data : begin of itab occurs 0,
              matnr like mara-matnr,
            end of itab.
Create the new one like this:
     data : begin of itab2 occurs 0,
              matnr(12) type n,
            end of itab2.

  • Encoding problem while reading binary data from MQ-series

    Dear all,
We are running on 7.0 and we have an encoding problem while reading binary data from MQ Series. Because we are getting flat strings from the queue, we use the module "Plain2XML" (MessageTransformBean) to wrap XML elements around the incoming data.
The MQ Series server is using CCSID 850, which we configured in the connection parameters of the communication channel (both the Queue Manager CCSID and the CCSID of the target). If there are special characters in the message (whose HEX values differ from code page to code page), we get errors in our adapter during execution; please see the stack trace below for further analysis.
It seems to us that
1. the method ByteToCharUTF8.convert() expects UTF-8 in the binary data
2. both CCSID parameters are not used at all in the JMS adapter
How can we solve this problem without changing anything on the MQ side?
    Here is the stack-trace:
    Catching com.sap.aii.af.mp.module.ModuleException: Transform: failed to execute the transformation: com.sap.aii.messaging.adapter.trans.TransformException: Error converting Message: 'sun.io.MalformedInputException'; nested exception caused by: sun.io.MalformedInputException caused by: com.sap.aii.messaging.adapter.trans.TransformException: Error converting Message: 'sun.io.MalformedInputException'; nested exception caused by: sun.io.MalformedInputException
        at com.sap.aii.af.modules.trans.MessageTransformBean.throwModuleException(MessageTransformBean.java:453)
        at com.sap.aii.af.modules.trans.MessageTransformBean.process(MessageTransformBean.java:387)
        at com.sap.aii.af.mp.module.ModuleLocalLocalObjectImpl0_0.process(ModuleLocalLocalObjectImpl0_0.java:103)
        at com.sap.aii.af.mp.ejb.ModuleProcessorBean.process(ModuleProcessorBean.java:292)
        at com.sap.aii.af.mp.processor.ModuleProcessorLocalLocalObjectImpl0_0.process(ModuleProcessorLocalLocalObjectImpl0_0.java:103)
        at com.sap.aii.adapter.jms.core.channel.filter.SendToModuleProcessorFilter.filter(SendToModuleProcessorFilter.java:84)
        at com.sap.aii.adapter.jms.core.channel.filter.MessageFilterContextImpl.callNext(MessageFilterContextImpl.java:195)
        at com.sap.aii.adapter.jms.core.channel.filter.ConvertBinaryToXiMessageFilter.filter(ConvertBinaryToXiMessageFilter.java:304)
        at com.sap.aii.adapter.jms.core.channel.filter.MessageFilterContextImpl.callNext(MessageFilterContextImpl.java:195)
        at com.sap.aii.adapter.jms.core.channel.filter.ConvertJmsMessageToBinaryFilter.filter(ConvertJmsMessageToBinaryFilter.java:112)
        at com.sap.aii.adapter.jms.core.channel.filter.MessageFilterContextImpl.callNext(MessageFilterContextImpl.java:195)
        at com.sap.aii.adapter.jms.core.channel.filter.InboundDuplicateCheckFilter.filter(InboundDuplicateCheckFilter.java:87)
        at com.sap.aii.adapter.jms.core.channel.filter.MessageFilterContextImpl.callNext(MessageFilterContextImpl.java:195)
        at com.sap.aii.adapter.jms.core.channel.filter.TxManagerFilter.filterSend(TxManagerFilter.java:123)
        at com.sap.aii.adapter.jms.core.channel.filter.TxManagerFilter.filter(TxManagerFilter.java:59)
        at com.sap.aii.adapter.jms.core.channel.filter.MessageFilterContextImpl.callNext(MessageFilterContextImpl.java:195)
        at com.sap.aii.adapter.jms.core.channel.filter.DynamicConfigurationFilter.filter(DynamicConfigurationFilter.java:72)
        at com.sap.aii.adapter.jms.core.channel.filter.MessageFilterContextImpl.callNext(MessageFilterContextImpl.java:195)
        at com.sap.aii.adapter.jms.core.channel.filter.PmiAgentFilter.filter(PmiAgentFilter.java:66)
        at com.sap.aii.adapter.jms.core.channel.filter.MessageFilterContextImpl.callNext(MessageFilterContextImpl.java:195)
        at com.sap.aii.adapter.jms.core.channel.filter.InboundCorrelationFilter.filter(InboundCorrelationFilter.java:60)
        at com.sap.aii.adapter.jms.core.channel.filter.MessageFilterContextImpl.callNext(MessageFilterContextImpl.java:195)
        at com.sap.aii.adapter.jms.core.channel.filter.JmsHeadersProfileFilter.filter(JmsHeadersProfileFilter.java:59)
        at com.sap.aii.adapter.jms.core.channel.filter.MessageFilterContextImpl.callNext(MessageFilterContextImpl.java:195)
        at com.sap.aii.adapter.jms.core.channel.filter.MessageInvocationsFilter.filter(MessageInvocationsFilter.java:89)
        at com.sap.aii.adapter.jms.core.channel.filter.MessageFilterContextImpl.callNext(MessageFilterContextImpl.java:195)
        at com.sap.aii.adapter.jms.core.channel.filter.JarmMonitorFilter.filter(JarmMonitorFilter.java:57)
        at com.sap.aii.adapter.jms.core.channel.filter.MessageFilterContextImpl.callNext(MessageFilterContextImpl.java:195)
        at com.sap.aii.adapter.jms.core.channel.filter.ThreadNamingFilter.filter(ThreadNamingFilter.java:62)
        at com.sap.aii.adapter.jms.core.channel.filter.MessageFilterContextImpl.callNext(MessageFilterContextImpl.java:195)
        at com.sap.aii.adapter.jms.core.channel.SenderChannelImpl.doReceive(SenderChannelImpl.java:263)
        at com.sap.aii.adapter.jms.core.channel.ChannelImpl.receive(ChannelImpl.java:437)
        at com.sap.aii.adapter.jms.core.connector.MessageListenerImpl.onMessage(MessageListenerImpl.java:36)
        at com.ibm.mq.jms.MQMessageConsumer$FacadeMessageListener.onMessage(MQMessageConsumer.java:399)
        at com.ibm.msg.client.jms.internal.JmsMessageConsumerImpl$JmsProviderMessageListener.onMessage(JmsMessageConsumerImpl.java:904)
        at com.ibm.msg.client.wmq.v6.jms.internal.MQMessageConsumer.receiveAsync(MQMessageConsumer.java:4249)
        at com.ibm.msg.client.wmq.v6.jms.internal.SessionAsyncHelper.run(SessionAsyncHelper.java:537)
        at java.lang.Thread.run(Thread.java:770)
    Caused by: com.sap.aii.messaging.adapter.trans.TransformException: Error converting Message: 'sun.io.MalformedInputException'; nested exception caused by: sun.io.MalformedInputException
        at com.sap.aii.messaging.adapter.Conversion.service(Conversion.java:714)
        at com.sap.aii.af.modules.trans.MessageTransformBean.processTransform(MessageTransformBean.java:538)
        at com.sap.aii.af.modules.trans.MessageTransformBean.processTransform(MessageTransformBean.java:528)
        at com.sap.aii.af.modules.trans.MessageTransformBean.processTransform(MessageTransformBean.java:471)
        at com.sap.aii.af.modules.trans.MessageTransformBean.process(MessageTransformBean.java:364)
        ... 36 more
    Caused by: sun.io.MalformedInputException
        at sun.io.ByteToCharUTF8.convert(ByteToCharUTF8.java:270)
        at sun.nio.cs.StreamDecoder$ConverterSD.convertInto(StreamDecoder.java:287)
        at sun.nio.cs.StreamDecoder$ConverterSD.implRead(StreamDecoder.java:337)
        at sun.nio.cs.StreamDecoder.read(StreamDecoder.java:223)
        at java.io.InputStreamReader.read(InputStreamReader.java:208)
        at java.io.BufferedReader.fill(BufferedReader.java:153)
        at java.io.BufferedReader.readLine(BufferedReader.java:316)
        at java.io.LineNumberReader.readLine(LineNumberReader.java:176)
        at com.sap.aii.messaging.adapter.Conversion.convertPlain2XML(Conversion.java:310)
        at com.sap.aii.messaging.adapter.Conversion.service(Conversion.java:709)
        ... 40 more
    Any ideas?
    Kind regards, Stefan

    Hi Stefan,
    for the first MTB now we are using only one parameter: Transform.ContentType = text/plain;charset="ISO-8859-1"
    The second MTB, which does the XML-Wrapping, is configured like this:
    Transform.Class = com.sap.aii.messaging.adapter.Conversion
    Transform.ContentType = application/xml
    xml.conversionType = SimplePlain2XML
    xml.fieldNames = value
    xml.fieldSeparator = §%zulu§%
    xml.processFieldNames = fromConfiguration
    xml.structureTitle = payload
    Both CCSID configuration parameters from the "Source"-Tab we've set to 850.
Now we don't get the sun.io.MalformedInputException anymore, but unfortunately the special-character conversion is still wrong: we need a "ß" and we get the ISO interpretation of hex E1, which is "á". In code page 850 (unlike ISO-8859-1), E1 is the "ß".
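Just to illustrate the code-page point with a tiny standalone check (this is not adapter code, only a demonstration of how the byte 0xE1 is interpreted under the two charsets):

import java.nio.charset.Charset;

public class CodePageCheck {
    public static void main(String[] args) {
        byte[] data = { (byte) 0xE1 };
        // Under IBM code page 850 the byte 0xE1 is the German sharp s.
        System.out.println(new String(data, Charset.forName("Cp850")));      // prints ß
        // Under ISO-8859-1 the very same byte is a lowercase a-acute.
        System.out.println(new String(data, Charset.forName("ISO-8859-1"))); // prints á
    }
}

So as long as the payload is decoded as ISO-8859-1 (or UTF-8) instead of code page 850, the ß cannot come through.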
    Any ideas?

  • Problems while migrating PDK applications from Portal 7.1 to Portal 7.3

    Hi All,
    I am facing a problem while migrating a PDK application from Portal 7.1 to Portal 7.3.
Since Portal 7.3 doesn't support PAR files any more, it provides a tool to convert the PAR to an EAR file and deploy the resulting EAR to the new portal.
I converted the PAR into an EAR and deployed the file, but the application is not executing. When I looked into the log files I found the following exception. The application is not able to identify the tag "documentBody". I checked the portalapp.xml and the "SharingReference" is properly maintained in the file, but the error message suggests that it is not able to find the reference to the taglib URI maintained in portalapp.xml.
Now my question is: why is it not able to obtain a reference to the taglib? Did anyone face a similar problem previously? Can you please provide me with any document for migrating / developing PDK applications for Portal 7.3?
    Parser [PRT] - JspParseException is thrown while parsing JSP file </SAP/sap/POD/J00/j2ee/cluster/apps/TestOrg.co.uk/TestOrg~cas~ptlsvc/servlet_jsp/CASPortalService/root/WEB-INF/pagelet/CASFramework.jsp> :
    com.sap.engine.services.servlets_jsp.jspparser_api.exception.JspParseException: Cannot parse custom tag with short name [documentBody].
    at com.sap.engine.services.servlets_jsp.jspparser_api.jspparser.syntax.JspElement.customTagAction(JspElement.java:969)
    at com.sap.engine.services.servlets_jsp.jspparser_api.jspparser.syntax.JspElement.action(JspElement.java:228)
    at com.sap.engine.services.servlets_jsp.jspparser_api.jspparser.syntax.ElementCollection.action(ElementCollection.java:59)
    at com.sap.engine.services.servlets_jsp.jspparser_api.jspparser.syntax.JspElement.customTagAction(JspElement.java:994)
    at com.sap.engine.services.servlets_jsp.jspparser_api.jspparser.syntax.JspElement.action(JspElement.java:228)
    at com.sap.engine.services.servlets_jsp.jspparser_api.jspparser.syntax.ElementCollection.action(ElementCollection.java:59)
    at com.sap.engine.services.servlets_jsp.jspparser_api.jspparser.syntax.JspElement.customTagAction(JspElement.java:994)
    at com.sap.engine.services.servlets_jsp.jspparser_api.jspparser.syntax.JspElement.action(JspElement.java:228)
    at com.sap.engine.services.servlets_jsp.jspparser_api.jspparser.syntax.ElementCollection.action(ElementCollection.java:59)
    at com.sap.engine.services.servlets_jsp.jspparser_api.jspparser.syntax.ElementCollection.action(ElementCollection.java:69)
    at com.sap.engine.services.servlets_jsp.jspparser_api.jspparser.GenerateJavaFile.generateJavaFile(GenerateJavaFile.java:72)
    at com.sap.engine.services.servlets_jsp.server.jsp.JSPProcessor.parse(JSPProcessor.java:272)
    at com.sap.engine.services.servlets_jsp.server.jsp.JSPProcessor.generateJavaFile(JSPProcessor.java:196)
    at com.sap.engine.services.servlets_jsp.server.jsp.JSPProcessor.parse(JSPProcessor.java:128)
    at com.sap.engine.services.servlets_jsp.jspparser_api.JSPChecker.getClassName(JSPChecker.java:377)
    at com.sap.engine.services.servlets_jsp.jspparser_api.JSPChecker.compileAndGetClassName(JSPChecker.java:306)
    at com.sap.engine.services.servlets_jsp.jspparser_api.JSPChecker.getClassNameForProduction(JSPChecker.java:236)
    Thanks in advance
    PK

    Hi Amit,
Thanks for your reply. I could see some progress after "thoroughly" going through those links. Thanks a lot (gave points too).
Now I am facing another problem. The application did deploy successfully, but with a warning. When I checked the warning it says that the deployment was successful but the application cannot be started.
    The log file has the following error message.
    Global [startApp] operation of application [newsint.co.uk/newsint~cas~ptlsvc] finished with errors for [5] ms on server process [2721950] [
    >>> Errors <<<
    1). com.sap.engine.services.deploy.exceptions.ServerDeploymentException: Application newsint.co.uk/newsint~cas~ptlsvc cannot be started. Reason: it has hard reference to the following resources, which are currently not available on the system: hard to SAPPORTAL/com.sapportals.htmlb (public) (f=true, cl=false); .
    In order to start the application, components that provide the missing resources should be successfully deployed.
    at com.sap.engine.services.deploy.server.ReferenceResolver.isResolved(ReferenceResolver.java:137)
    at com.sap.engine.services.deploy.server.LifecycleController.startReferencedComponents(LifecycleController.java:173)
    at com.sap.engine.services.deploy.server.application.StartTransaction.beginCommon(StartTransaction.java:200)
    at com.sap.engine.services.deploy.server.application.StartTransaction.begin(StartTransaction.java:166)
    at com.sap.engine.services.deploy.server.application.ApplicationTransaction.makeAllPhasesOnOneServer(ApplicationTransaction.java:421)
    at com.sap.engine.services.deploy.server.application.ParallelAdapter.makeAllPhases(ParallelAdapter.java:465)
    at com.sap.engine.services.deploy.server.application.StartTransaction.makeAllPhases(StartTransaction.java:605)
    at com.sap.engine.services.deploy.server.DeployServiceImpl.makeGlobalTransaction(DeployServiceImpl.java:1828)
I don't understand why SAPPORTAL/com.sapportals.htmlb is not available. As per my understanding it is something that should be available with the portal server by default.
    Any hints from any one?
    thanks in advance
    PK

  • Help with migrating user data from one account to another

I am on a G5 with 10.4.11. I installed FCP Studio 6 and everything works fine except Compressor. I have had only one user account on this Mac (I will call it account 1). I was told to create another account (account 2), log on and see if Compressor works. It does. Basically there are files missing in account 1 and Compressor won't run. I have reinstalled 3 times with no change.
Anyway, I was told to migrate all my user data from account 1 to account 2. I have been looking at threads and want to make sure I am doing the right thing. I found a way to do this; please see if it works. It may be from the Tiger forum:
    pick a short user name for your new account. then run the following terminal commands
    sudo mkdir /users/newshortname
    sudo ditto ~ /users/newshortname
    You'll have to enter your admin password (which you won't see) after the first command. that's normal. This will copy your current home directory to /users/newshortname. Then go to system preferences->accounts and create a new account with the short name newshortname. You'll get a popup saying that a home directory by that name already exists and asking if you want to use it. say yes.
Anyway, I would like to migrate everything from account 1 to 2 and have all preferences, everything. I would also like to be able to delete the old account (account 1) and rename account 2. Since this is a home Mac, I really don't need it password protected. I would like this Mac to just use my new account 2 at startup.
Please ask me any questions. This is my editing Mac, and I can't really get around inside the OS. I know FCP like the back of my hand, but I don't want to screw up this computer.
    Thanks in advance.
    Message was edited by: Nelson May

    Yeah, you should be able to move most of it by changing permissions.
    Network settings could be any/all of these though for instance...
    /Users/YourUserName/Library/Preferences/ByHost/com.apple.networkConnect.<12 digit number>.plist
    /Users/YourUserName/Library/Preferences/com.apple.internetconnect.plist
    /Library/Preferences/SystemConfiguration/preferences.plist
    /Users/YourUserName/Library/Preferences/com.apple.systempreferences.plist
    /Library/Preferences/SystemConfiguration/com.apple.airport.preferences.plist
    /Library/Preferences/com.apple.sharing.firewall.plist
    /Library/Preferences/SystemConfiguration/NetworkInterfaces.plist
    /Library/Preferences/com.apple.networkConfig.plist
    /Library/Preferences/SystemConfiguration/com.apple.nat.plist
    /Library/Preferences/com.apple.print.FaxPrefs.plist
    As well as the old Keychain & cookies from your Browser.

  • Migrate user data from SLES10sp3 OES2sp3 to new hardware

    Hi,
I am trying to migrate user data on an NSS volume from a SLES10sp3 and OES2sp3 server to new hardware already set up with SLES10sp3 and OES2sp3.
Authentication and configuration of the data selection is sorted. After starting the data transfer I get a message saying the migration failed.
    Below is the error from the log file:
    FATAL -FILESYSTEM:migfiles:Fatal: Cannot migrate source path
    /media/nss/DATA/Users/JohnG/: nbackup: Failed to open dataset
    /media/nss/DATA/Users/JohnG/ for backup
    WARNING - FILESYSTEM:migfiles:Warning: Going to next data set
    ERROR - FILESYSTEM:migfiles:Error: nbackup: (libtsafs.so 6.50.0 291) An
    invalid path was used.
    Regards
    John

    Hi kjhurni,
I have not set up updates from the Novell channel, as I have an issue connecting to the update channel, but I will deal with that later.
I tried to run miggui to migrate my data to the new server with SLES10sp3 and OES2sp3. Below is the entire log file; I have highlighted the error at the end of the log file. (Names and addresses have been changed to protect the innocent.)
    Regards
    John
    2012-01-12 11:15:17,394 INFO - Migration
    Framework:getSourceServerConfigDetails():Finding Source Server eDirectory
    Tree Name using command: ndsstat -h SERVER1
    2012-01-12 11:15:17,474 INFO - MigUtil:ResolvIP:Resolving IP address from
    hostname SERVER1
    2012-01-12 11:15:17,539 INFO - Migration Framework:Getting Source Server
    certificate:Using 10.1.3.1 to retrieve Source Server certificate
    2012-01-12 11:15:17,539 INFO - Migration Framework:Getting Source Server
    certificate:executing command: LC_ALL=en_US.UTF-8 /usr/bin/ruby
    /opt/novell/migration/sbin/getServerCert.rb --server-ip "10.1.3.1"
    --ldap-port 636 --file /opt/novell/migration/plugin/conf/SourceServerCert
    2012-01-12 11:15:20,795 INFO - Migration Framework:Getting Source tree
    certificate:using getSSCert command
    2012-01-12 11:15:20,801 INFO - Migration Framework:Getting Source tree
    certificate:Using 10.1.3.1 to retrieve Source tree certificate
    2012-01-12 11:15:20,801 INFO - Migration Framework:Getting Source tree
    certificate:executing command: LC_ALL=en_US.UTF-8
    /opt/novell/oes-install/util/getSSCert -a "10.1.3.1" -t "TREE_NAME" -u
    "admin.b"
    2012-01-12 11:15:21,087 INFO - Migration
    Framework:MigController:getSourceServerTreeCert(): :Copied certificate to
    /opt/novell/migration/plugin/conf/SourceCert.der
    2012-01-12 11:15:21,127 INFO - Migration
    Framework:importSourceTreeCert():Using 10.1.3.1 to import Source Tree
    certificate
    2012-01-12 11:15:21,127 INFO - Migration Framework:Importing Source Tree
    certificate:Importing certificate to Java keystore
    (/opt/novell/migration/plugin/conf/novellStore.keystore)
    2012-01-12 11:15:21,911 INFO - Migration
Framework:importSourceTreeCert():Deleting existing source tree certificate
    using command: /usr/bin/keytool -delete -noprompt -alias 10.1.3.1 -keystore
    /opt/novell/migration/plugin/conf/novellStore.keystore
    2012-01-12 11:15:21,911 INFO - Migration Framework:importSourceTreeCert():
    command: /usr/bin/keytool -import -alias 10.1.3.1 -file
    /opt/novell/migration/plugin/conf/SourceCert.der -noprompt -keystore
    /opt/novell/migration/plugin/conf/novellStore.keystore
    2012-01-12 11:15:22,577 INFO - Migration Framework:SourceServer
    Authentication:Attempting to Connect to Source with the parameters
    2012-01-12 11:15:22,587 INFO - Migration Framework:SourceServer
    Authentication:Authenticating using LDAP(secure)
    2012-01-12 11:15:23,263 INFO - Migration Framework:SourceServer
    Authentication:Authenticating to the Source Server returned true
    2012-01-12 11:15:23,270 INFO - Migration Framework:SourceServer
Authentication:Detecting the source server version
    2012-01-12 11:15:23,315 INFO - Migration Framework:Authentication:Using SMS
    to validate root password for server 10.1.3.1
    2012-01-12 11:15:23,315 INFO - MigUtils:Authentication:Executing
    command:/opt/novell/sms/bin/nbackup -U root -R 10.1.3.1 -cf
    /var/opt/novell/migration/NewProj5/migsrc.sidf /etc/SuSE-release
    2012-01-12 11:15:37,876 INFO - Migration
    Framework:getTargetServerConfigDetails():Finding Target Server eDirectoty
    Tree Name using command: ndsstat -h 10.1.3.2
    2012-01-12 11:15:37,930 INFO - Migration Framework:Getting Target Server
    certificate:Using 10.1.3.2 to retrieve Target Server certificate
    2012-01-12 11:15:37,930 INFO - Migration Framework:Getting Target Server
    certificate:executing command: LC_ALL=en_US.UTF-8 /usr/bin/ruby
    /opt/novell/migration/sbin/getServerCert.rb --server-ip "10.1.3.2"
    --ldap-port 636 --file /opt/novell/migration/plugin/conf/TargetServerCert
    2012-01-12 11:15:40,115 INFO - Migration
    Framework:getTargetTreeCertificate():Attempting to connect to Target with
    the parameters
    2012-01-12 11:15:40,116 INFO - Migration
    Framework:getTargetTreeCertificate():Copy certificate to
    /opt/novell/migration/plugin/conf/TargetCert.der for CLI usage
    2012-01-12 11:15:40,122 INFO - Migration Framework:Importing Target Tree
    certificate:Attempting to connect to Target with the parameters
    2012-01-12 11:15:40,122 INFO - Migration Framework:Importing Target Tree
    certificate:Importing certificate to Java keystore
    (/opt/novell/migration/plugin/conf/novellStore.keystore)
    2012-01-12 11:15:40,122 INFO - Migration
    Framework:importTargetTreeCert():keytool delete command to be executed to
    delete target tree certificate
    2012-01-12 11:15:40,122 INFO - Migration
    Framework:importTargetTreeCert():Executing command : /usr/bin/keytool
    -delete -noprompt -alias 10.1.3.2 -keystore
    /opt/novell/migration/plugin/conf/novellStore.keystore
    2012-01-12 11:15:40,788 INFO - Migration
    Framework:importTargetTreeCert():Using keytool to delete existing
    alias10.1.3.2
    2012-01-12 11:15:40,789 INFO - Migration
    Framework:importTargetTreeCert():keytool command to be executed to store
    target tree certificate
    2012-01-12 11:15:41,486 INFO - Migration
    Framework:importTargetTreeCert():Return Value true
    2012-01-12 11:15:41,486 INFO - Migration Framework:TargetServer
    Authentication:Attempting to connect to Target with the parameters
    2012-01-12 11:15:41,487 INFO - Migration Framework:TargetServer
    Authentication:Authenticating to the Target Server(secure)
    2012-01-12 11:15:41,550 INFO - Migration Framework:Authentication:Using SMS
    to validate root password for server 10.1.3.2
    2012-01-12 11:15:41,550 INFO - MigUtils:Authentication:Executing
    command:/opt/novell/sms/bin/nbackup -U root -l /etc/SuSE-release
    2012-01-12 11:16:35,254 INFO - FILESYSTEM:migfiles:Validating data from
    source path(s) to target path(s) using migfiles
    2012-01-12 11:16:35,254 INFO - FILESYSTEM:migfiles:Executing command:
    /opt/novell/migration/sbin/migfiles --source-server 10.1.3.1 --verbose
    --source-path "@/var/opt/novell/migration/NewProj5/fs/sourcepaths.in"
    --destination-full-path
    "@/var/opt/novell/migration/NewProj5/fs/targetpaths.in" --precheck
    --progress --progress-interval 1 --source-ldap-port 636
    --delete-file-on-restore-error
    2012-01-12 11:16:35,618 INFO - FILESYSTEM:migfiles:Enter the username (e.g
    cn=admin,o=mycompany)
    2012-01-12 11:16:35,618 INFO - FILESYSTEM:migfiles:for the server 10.1.3.1:
    Enter cn=admin,o=b password:
    2012-01-12 11:16:35,618 INFO - FILESYSTEM:migfiles:Enter the username (e.g
    root)
    2012-01-12 11:16:35,619 INFO - FILESYSTEM:migfiles:for the server 10.1.3.2:
    Enter root password:
    2012-01-12 11:16:39,225 INFO - FILESYSTEM:migfiles:Information: Validation
    for migfiles is done successfully
    2012-01-12 11:16:39,230 INFO - FILESYSTEM:migfiles:
    2012-01-12 11:16:39,230 INFO - FILESYSTEM:migfiles:Successfully validated
    data
    2012-01-12 11:16:39,245 INFO -
    FILESYSTEM:Information:****************Migrating files and/or
    trustees****************
    2012-01-12 11:16:39,246 INFO - FILESYSTEM:migfiles:Migrating data from
    source path(s) to target path(s) using migfiles
    2012-01-12 11:16:39,246 INFO - FILESYSTEM:migfiles:Executing command:
    /opt/novell/migration/sbin/migfiles --source-server 10.1.3.1 --verbose
    --source-path "@/var/opt/novell/migration/NewProj5/fs/sourcepaths.in"
    --destination-full-path
    "@/var/opt/novell/migration/NewProj5/fs/targetpaths.in" --session-file
    "/var/opt/novell/migration/NewProj5/fs/migfiles.session" --progress
    --progress-interval 1 --source-ldap-port 636
    --delete-file-on-restore-error
    2012-01-12 11:16:39,534 INFO - FILESYSTEM:migfiles:Enter the username (e.g
    cn=admin,o=mycompany)
    2012-01-12 11:16:39,534 INFO - FILESYSTEM:migfiles:for the server 10.1.3.1:
    Enter cn=admin,o=b password:
    2012-01-12 11:16:39,534 INFO - FILESYSTEM:migfiles:Enter the username (e.g
    root)
    2012-01-12 11:16:39,534 INFO - FILESYSTEM:migfiles:for the server 10.1.3.2:
    Enter root password:
    2012-01-12 11:16:45,078 INFO - FILESYSTEM:migfiles:Information: nbackup:
    precomputing for displaying progress...
    2012-01-12 11:16:45,157 FATAL - FILESYSTEM:migfiles:Fatal: Cannot migrate
    source path /media/nss/DATA/Users/JohnG/: nbackup: Failed to open dataset
    /media/nss/DATA/Users/JohnG/ for backup
    2012-01-12 11:16:45,167 WARN - FILESYSTEM:migfiles:Warning: Going to next
    data set
    2012-01-12 11:16:45,168 ERROR - FILESYSTEM:migfiles:Error: nbackup:
    (libtsafs.so 6.50.0 291) An invalid path was used.
    2012-01-12 11:16:45,242 FATAL - FILESYSTEM:migfiles:Fatal: nbackup command
    failed to complete
    2012-01-12 11:16:45,246 INFO - FILESYSTEM:migfiles:
    2012-01-12 11:16:45,247 ERROR - FILESYSTEM:migfiles:migfiles command
    failed!
    2012-01-12 11:16:45,247 INFO -
FILESYSTEM:Information:****************End****************

  • Migrating Users & Data from a Powerbook G4 to a MBP

OK, I'm getting a refurbished MBP and I'm going to need to transfer my stuff over from the PowerBook G4 to the MBP. The thing is, both the 400 and 800 FireWire ports on the PowerBook have ceased to function, so Target Disk Mode will not work for me. I could use the Ethernet form of transfer to get everything over, but that implies that the MBP will have to be up and running and have its network settings all configured in order for it to be "seen" on the network. Is this the case? Or, upon startup of the MacBook Pro, will I be prompted to migrate data & users via Ethernet?
    In the past, I've been able to use Target Disk Mode and I'm familiar with that procedure, I just don't know what to expect via an Ethernet migration. Can anyone enlighten me?
    Thanks

    A Basic Guide for Migrating to Intel-Macs
    If you are migrating a PowerPC system (G3, G4, or G5) to an Intel-Mac be careful what you migrate. Keep in mind that some items that may get transferred will not work on Intel machines and may end up causing your computer's operating system to malfunction.
    Rosetta supports "software that runs on the PowerPC G3, G4, or G5 processor that are built for Mac OS X". This excludes the items that are not universal binaries or simply will not work in Rosetta:
    Classic Environment, and subsequently any Mac OS 9 or earlier applications
    Screensavers written for the PowerPC
    System Preference add-ons
    All Unsanity Haxies
    Browser and other plug-ins
    Contextual Menu Items
    Applications which specifically require the PowerPC G5
    Kernel extensions
    Java applications with JNI (PowerPC) libraries
    See also What Can Be Translated by Rosetta.
    In addition to the above you could also have problems with migrated cache files and/or cache files containing code that is incompatible.
    If you migrate a user folder that contains any of these items, you may find that your Intel-Mac is malfunctioning. It would be wise to take care when migrating your systems from a PowerPC platform to an Intel-Mac platform to assure that you do not migrate these incompatible items.
    If you have problems with applications not working, then completely uninstall said application and reinstall it from scratch. Take great care with Java applications and Java-based Peer-to-Peer applications. Many Java apps will not work on Intel-Macs as they are currently compiled. As of this time Limewire, Cabos, and Acquisition are available as universal binaries. Do not install browser plug-ins such as Flash or Shockwave from downloaded installers unless they are universal binaries. The version of OS X installed on your Intel-Mac comes with special compatible versions of Flash and Shockwave plug-ins for use with your browser.
    The same problem will exist for any hardware drivers such as mouse software unless the drivers have been compiled as universal binaries. For third-party mice the current choices are USB Overdrive or SteerMouse. Contact the developer or manufacturer of your third-party mouse software to find out when a universal binary version will be available.
    Also be careful with some backup utilities and third-party disk repair utilities. Disk Warrior 4.1, TechTool Pro 4.6.1, SuperDuper 2.5, and Drive Genius 2.0.2 work properly on Intel-Macs with Leopard. The same caution may apply to the many "maintenance" utilities that have not yet been converted to universal binaries. Leopard Cache Cleaner, Onyx, TinkerTool System, and Cocktail are now compatible with Leopard.
    Before migrating or installing software on your Intel-Mac check MacFixit's Rosetta Compatibility Index.
    Additional links that will be helpful to new Intel-Mac users:
    Intel In Macs
    Apple Guide to Universal Applications
    MacInTouch List of Compatible Universal Binaries
    MacInTouch List of Rosetta Compatible Applications
    MacUpdate List of Intel-Compatible Software
    Transferring data with Setup Assistant - Migration Assistant FAQ
Because Migration Assistant isn't the ideal way to migrate from PowerPC to Intel Macs, using Target Disk Mode, copying the critical contents to CD or DVD, an external hard drive, or networking will work better. The initial section below discusses Target Disk Mode, followed by a section which discusses networking with Macs that lack FireWire.
    If both computers support the use of Firewire then you can use the following instructions:
    1. Repair the hard drive and permissions using Disk Utility.
    2. Backup your data. This is vitally important in case you make a mistake or there's some other problem.
    3. Connect a Firewire cable between your old Mac and your new Intel Mac.
    4. Startup your old Mac in Target Disk Mode.
    5. Startup your new Mac for the first time, go through the setup and registration screens, but do NOT migrate data over. Get to your desktop on the new Mac without migrating any new data over.
If you are not able to use a FireWire connection (for example you have a Late 2008 MacBook that only supports USB):
    1. Set up a local home network: Creating a small Ethernet Network.
    2. If you have a MacBook Air or Late 2008 MacBook see the following:
    MacBook (13-inch, Aluminum, Late 2008) and MacBook Pro (15-inch, Late 2008)- Migration Tips and Tricks;
    MacBook (13-inch, Aluminum, Late 2008) and MacBook Pro (15-inch, Late 2008)- What to do if migration is unsuccessful;
    MacBook Air- Migration Tips and Tricks;
    MacBook Air- Remote Disc, Migration, or Remote Install Mac OS X and wireless 802.11n networks.
    Copy the following items from your old Mac to the new Mac:
    In your /Home/ folder: Documents, Movies, Music, Pictures, and Sites folders.
    In your /Home/Library/ folder:
    /Home/Library/Application Support/AddressBook (copy the whole folder)
    /Home/Library/Application Support/iCal (copy the whole folder)
    Also in /Home/Library/Application Support (copy whatever else you need including folders for any third-party applications)
    /Home/Library/Keychains (copy the whole folder)
    /Home/Library/Mail (copy the whole folder)
    /Home/Library/Preferences/ (copy the whole folder)
/Home/Library/Calendars (copy the whole folder)
/Home/Library/iTunes (copy the whole folder)
/Home/Library/Safari (copy the whole folder)
    If you want cookies:
    /Home/Library/Cookies/Cookies.plist
    /Home/Library/Application Support/WebFoundation/HTTPCookies.plist
    For Entourage users:
    Entourage is in /Home/Documents/Microsoft User Data
    Also in /Home/Library/Preferences/Microsoft
    Credit goes to Macjack for this information.
    If you need to transfer data for other applications please ask the vendor or ask in the Discussions where specific applications store their data.
Once you have transferred what you need, restart the new Mac and test to make sure the contents are there for each of the applications.
    Written by Kappy with additional contributions from a brody.
    Revised 1/6/2009
    Connect the two computers via FireWire to do any data transfer whether manually or with Migration Assistant.

  • Can anyone pls. help me : facing problem while fetching the data from BAPI

    Hi all,
We have installed xMII on a new server. On this new server I am trying to fetch data from a BAPI and write it into a file through a transaction, but I can't see the data in the tracer, and neither is the file created. With the same configuration and connection I am able to get the data on the old server. Can anyone please tell me what could be the problem, or is there anything else that we might have forgotten while installing xMII on the new server?
Your help would be greatly appreciated.
    Thanks,
    Sushma.

    Hi Ravi,
no, I am not able to see the table structure either. This is what it is showing in the tracer:
       [INFO ]: Execution Started At: 17:24:17
    [DEBUG]: 00000.03100 Begin Transaction 'TMP99A51958-5BAE-CDE0-0DB5-A3A8C72BC297'
    [DEBUG]: 00000.03100 Begin Sequence Sequence : ()
    [DEBUG]: 00000.03100 Begin Action SAPJCOInterface_0 : (SAP JCO Interface)
    [DEBUG]: 00006.43700 Connection Took 6406 mS
    [DEBUG]: 00009.82800 Function Creation Took 3391 mS
    [DEBUG]: 00010.25000 Execution Took 422 mS
    [DEBUG]: 00010.25000 End Action SAPJCOInterface_0 : (SAP JCO Interface)
    [DEBUG]: 00010.25000 Begin Sequence Sequence_0 : ()
    [DEBUG]: 00010.25000 Begin Action Repeater_0 : (Repeater)
    [DEBUG]: 00010.26600 End Action Repeater_0 : (Repeater)
    [DEBUG]: 00010.26600 End Sequence Sequence_0 : ()
    [DEBUG]: 00010.26600 End Sequence Sequence : ()
    [DEBUG]: 00010.26600 End Transaction 'TMP99A51958-5BAE-CDE0-0DB5-A3A8C72BC297'
    [INFO ]: Execution Completed At: 17:24:28 Elapsed Time was 10235 mS
I suspect the Repeater is not working, because I checked the JCO connection and that's fine.
    Thanks,
    Sushma.

  • Problem while migrating the extensions from one instance to the other.

    Hi,
We are using the Kintana workbench to migrate objects from one instance to the other. Migrating from Instance1 to Instance2 was fine, and it works properly in Instance2. But when the same package is migrated to Instance3, everything migrates successfully and the log files look proper; however, when I access the pages in Instance3, I am getting errors on the page like this:
    ViewObject attribute is null; ViewUsageName: (ScorecardSummaryVO1); RegionItem: (LastUpdateLogin)
Can someone let me know what could be the problem?
    Thanks.

    user555006,
The error shows that the LastUpdateLogin item on the page has no proper VO attribute attached. Hence it seems that the MDS data is not updated properly. Upload the page again and test.
--Shiv

  • Migrate user data from SLES10sp3 OES2sp3 to new hardware (again)

    Hi,
    I am running SLES10sp3_64 with OES2sp3_64 and need to migrate the user data
    to a new server running the same OS. I have also updated the SLES & OES to
    the maintenance pack of November 2011.
    The miggui authenticates and I can select the folders for migration. Below
    is an extract of the log file error:
    FATAL -FILESYSTEM:migfiles:Fatal: Cannot migrate source path
    /media/nss/DATA/Users/JohnG/: nbackup: Failed to open dataset
    /media/nss/DATA/Users/JohnG/ for backup
    WARN - FILESYSTEM:migfiles:Warning: Going to next data set
    ERROR - FILESYSTEM:migfiles:Error: nbackup: (libtsafs.so 6.50.0 291) An
    invalid path was used.
    FATAL - FILESYSTEM:migfiles:Fatal: nbackup command failed to complete
    INFO - FILESYSTEM:migfiles:
    ERROR - FILESYSTEM:migfiles command failed!
    Regards
    John

    Hi,
The log is from /var/opt/novell/migration of the miggui. The beginning of the log is about the connection and the authentication to the servers, which is 100% fine.
I then also ran the "nbackup" command from a terminal with the same result. I don't think it matters, but I am migrating from and to NSS volumes.
I have looked at the TID and have checked all relevant points and, again, no issues. Perhaps this is an "NCP" issue rather than a "miggui" issue.
    Thanks again.
    John
    Originally Posted by kjhurni
    Which log file is this from? Looks like the filesystem log?
    I think there's a migration log and a debug log as well that may show some additional info.
    Have you looked at this TID (slightly old, but I think most of it still applies)
    Troubleshooting data migration issues with miggui

  • Problem in migrating huge data from mysql to oracle using OMWB.

I'm using OMWB to migrate my MySQL database to an Oracle database. The MySQL database contains some 43 tables and all of them were copied to Oracle except a table "as_db", which contains some 503 MB of data. The table has 3 columns, two of integer type and one a MEDIUMBLOB. While migrating I'm not getting any error. After migrating I opened the table in Oracle: the table itself was created, but it holds no data, while all the other tables contain data.
Then I generated the CTL file using SQL*Loader for that particular table alone and tried to execute it, but it also fails, saying the column size is exceeded for the BLOB.
How do I handle this issue?
    Thanks in Advance,
    Jai.M

Looks like you may have to save your BLOBs out as files and load them individually through the sqlldr process. I know this has normally been done via RAW to BLOB.
In what way is your BLOB data being saved out in the extract data file?
    Barry
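If the sqlldr route keeps failing on the BLOB column, one alternative worth sketching (not what OMWB itself does, and the column names below are made up to match the description of as_db) is to stream each extracted BLOB file into the Oracle table over plain JDBC:

import java.io.File;
import java.io.FileInputStream;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class BlobLoader {

    // Hypothetical column names for as_db: two integer columns and the BLOB column.
    private static final String SQL =
            "INSERT INTO as_db (col1, col2, blob_col) VALUES (?, ?, ?)";

    public static void load(String url, String user, String pwd,
                            int col1, int col2, File blobFile) throws Exception {
        // Needs the Oracle JDBC driver on the classpath.
        Connection conn = DriverManager.getConnection(url, user, pwd);
        try {
            PreparedStatement ps = conn.prepareStatement(SQL);
            ps.setInt(1, col1);
            ps.setInt(2, col2);
            FileInputStream in = new FileInputStream(blobFile);
            // Stream the file contents straight into the BLOB column.
            ps.setBinaryStream(3, in, (int) blobFile.length());
            ps.executeUpdate();
            in.close();
            ps.close();
        } finally {
            conn.close();
        }
    }
}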

  • Problem While loading Master data from R/3

    Hi All,
When I am trying to load master data from R/3 DataSources to BW InfoObjects via the InfoProvider menu, the data is coming into the PSA but not going to the data targets. The data matches the data in R/3, but:
    Missing message: Request received
    Missing message: Number of sent records
    Missing message: Selection completed
    Data Package 1 : arrived in BW ; Processing : Selected number does not agree with transferred n
    Info IDoc 1 : sent, not arrived ; IDoc ready for dispatch (ALE service)
and the error message says ERROR 4 in the update.
Can you please help me in this regard; it's urgent.

    Hi,
    Check for pending IDOCS in BD87 or SM58 and execute them.
    In RSMO, select the particular load you want to monitor.
    In the menu bar, Environment >>> Transact. RFC >>> Select whichever is required, BW or Source System.
    In the next screen you can select the Execute button and the IDOCS will be displayed.
    Refer this Tcodes for IDOC status.
    WE02 Display IDoc
    WE05 IDoc Lists
    WE06 Active IDoc monitoring
    WE07 IDoc statistics
    WE08 Status File Interface
    WE09 Search for IDoc in Database
    WE10 Search for IDoc in Archive
    WE23 Verification of IDoc processing
    IDOC
    www.erpgenie.com/sapgenie/docs/ale_whitepaper.doc
    Check Note 561880 - Requests hang because IDocs are not processed.
    OR
    Transact RFC error 
tRFC Error - status stays yellow ("running") for a long time (Transact. RFC is reached via the Status tab in RSMO).
Step 1: Go to Details, Status, get the IDoc number, then go to BD87 in R/3. Place the cursor on the RED IDoc entries in the tRFC queue under outbound processing and click on Display IDoc in the menu bar.
Step 2: In the next screen click on Display tRFC calls (this takes you to the particular tRFC call in SM58), place the cursor on the particular Transaction ID, go to EDIT in the menu bar and press 'Execute LUW'.
(Display tRFC calls (takes you to the particular SM58 tRFC call) ---> select the Transaction ID ---> EDIT ---> Execute LUW)
Rather than going to SM58 and executing the LUW directly, it is safer to go through BD87 with the IDoc number, as it will take you to the particular tRFC request for that IDoc.
    Thanks,
    JituK

  • Problems while migrating ODI code from 1 env to other

    Hi,
I had some ODI code in Env1. I moved it to Env2, did some new development there (created new models, packages, etc.) and then tried to migrate it into Env3. What I found is that the exports for the new development done in Env2 imported easily into Env3; however, the exports of the code (from Env1) taken from Env2 give the error ODI-10018: The repository {0} is not coherent between the current repository and the import file.
This has put me in a lot of trouble. Is there any way to resolve this?

Well, this problem is not new and not unusual.
What are the repository IDs for Env 1, Env 2 and Env 3?
I suspect that the Env 1 ID and the Env 3 ID are the same.
There is an option in ODI 11g to renumber the repository IDs: in Topology Manager -> Repositories, select the repository and right-click.
You can select Env 2, choose an unused repository number and renumber your repository. Then the objects will be easily imported into Env 3.

  • Problem while downloading the data from servlet across SSL

    Hi All,
I am trying to download data with the help of a servlet across HTTPS.
The settings mentioned below work fine if the application is on HTTP.
    httpservletresponse.setHeader("Cache-Control", "");
    httpservletresponse.setHeader("pragma", "");
    httpservletresponse.setContentType("application/text");
    httpservletresponse.setHeader("Content-disposition","attachment;filename=Download"+dfileid+".txt");
But if the application is on HTTPS, the above settings work fine if I am saving the file; however, if I try to open it directly, I get the error below from Internet Explorer.
The client (IE) tries to open the file from the local machine and shows an alert like the one listed below.
Alert from the browser:
'C:\Documents and Settings\chalagen\Local Settings\Temporary Internet Files\Content.IE5\I9FGH0RI\Download196[1].csv' could not be found. Check the spelling of the file name, and verify that the file location is correct.
If you are trying to open the file from your list of most recently used files on the File menu, make sure that the file has not been renamed, moved or deleted.
I have also changed the option Tools > Internet Options > Advanced > "Do not save encrypted pages to disk".
    Please do let me know the solutions for the above problem.
    Waiting for response,
    SonaShree
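For reference, a minimal sketch of download headers that usually avoid this problem in IE over HTTPS: IE has to be allowed to write the attachment to its cache before it can open it, so no-cache directives (and the enabled "Do not save encrypted pages to disk" option) get in the way. The servlet class and file naming here are made up for illustration.

import java.io.IOException;
import java.io.PrintWriter;
import javax.servlet.http.HttpServletResponse;

public class DownloadHelper {

    public static void writeCsv(HttpServletResponse response, String fileId, String data)
            throws IOException {
        // Cacheable by the client only; avoid "no-cache"/"no-store" for SSL downloads in IE.
        response.setHeader("Cache-Control", "private");
        // Clear any Pragma header a framework might add; "Pragma: no-cache" triggers the same problem.
        response.setHeader("Pragma", "");
        // "application/text" is not a registered MIME type; text/plain (or text/csv) is safer.
        response.setContentType("text/plain");
        response.setHeader("Content-Disposition",
                "attachment;filename=Download" + fileId + ".txt");
        PrintWriter out = response.getWriter();
        out.write(data);
        out.flush();
    }
}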

