Open Item Import fails with a process_flag = -999 or -888

Problem description
============
The Open Item Import fails with a process_flag = -999 in
mtl_system_items_interface. This keeps the item from being imported into
mtl_system_items. The process_flag has to be set back to 1 to have the row(s)
reprocessed.
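The reset described above can also be done directly in SQL; here is a minimal sketch (the WHERE clause is an assumption and may need to be narrowed, for example by set_process_id, if only specific rows should be retried):

UPDATE mtl_system_items_interface
   SET process_flag = 1
 WHERE process_flag IN (-999, -888);
COMMIT;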
Solution
======
UPDATE mtl_system_items_interface
   SET transaction_type = 'CREATE'
 WHERE transaction_type = '<current value>';   -- replace the placeholder with the incorrect value currently in the rows
COMMIT;
Re-run the Item Import; it should now complete successfully.
Regards,
Mehboob

Hi,
It looks like you have posted the same question in the thread below.
Item Import completed with Process_Flag = 31, 41 and 42
Anyway, if you still have a problem you can refer to the MetaLink notes below.
NOTE:103869.1 - Item Attribute vs Template Attributes Using IOI
NOTE:106812.1 - Instructions for Running Item Open Import (IOI) Including Historical Revision Da
NOTE:109628.1 - FAQ for Item Import
NOTE:268968.1 - Understanding Item Import and Debugging Problems with Item Import
NOTE:458544.1 - Inventory Item Open Interface ITAR Template
NOTE:52746.1 - A Guideline to IOI Error Messages and Solutions
Thanks
Vishalaksha

Similar Messages

  • Open item account line with flow type has to contain a partner

    Hi,
    I am getting the following error while doing periodic posting in flexible real estate management.
    " Open item account line with flow type Z240 has to contain a partner"
    " Open item account line with flow type Z820 has to contain a partner"
    Message no. RERACA005
    I have found a similar message in this forum.
This error does not occur for the other flow type Z120 in the contract.
The accrual type for accruals is set as ANRVCN and the accrual type for deferrals is set as TRRVCN in the definition of the flow type, with S Debit posting.
For ANRVCN the posting supported is "Only periodic", and for TRRVCN it is "All".
Please guide me as to what I need to check and correct.
    Regards,
    T Saravanan

    Hi Saravanan.
    I have the same problem, could you help me to solve it?
    thanks in advance!
    Lucas

  • Open item account line with flow type 0178 has to contain a partner

At the time of RERAPP periodic posting I am getting the error message
"Open item account line with flow type 0178 has to contain a partner"
        Message no. RERACA005
    Please guide me on the same.

    Hi,
As per your suggestion, I have posted the settings for the flow type:
    In Define Flow type
    140     Security Deposit     S Debit Posting     ANRVCN     TRRVCN
    Assign Reference Flow Types
    10 Follow-Up Postings Due to Condition Increase     140     Security Deposit     140     Security Deposit
    20 Follow-Up Postings Due to Condition Reduction     140     Security Deposit     140     Security Deposit
    30 Distribution Postings (Object Transfers)     140     Security Deposit     140     Security Deposit
    Please guide me

  • Req import fails with POCIRM-24a: ORA-01400:

Req import fails with:
    POCIRM-24a: ORA-01400: cannot insert NULL into ("PO"."PO_REQ_DISTRIBUTIONS_ALL"."D

    Please post the details of the application release, database version and OS.
Req import fails with:
POCIRM-24a: ORA-01400: cannot insert NULL into ("PO"."PO_REQ_DISTRIBUTIONS_ALL"."D
Please follow the steps in these docs for details about the error.
    Adding To Cart Delivers Error - ORA-01400: Cannot Insert NULL Into ("PO"."PO_REQUISITION_LINES_ALL"."DELIVER_TO_LOCATION_ID") [ID 580002.1]
    FAQ: Common Tracing Techniques within the Oracle Applications 11i/R12 [ID 296559.1]
    Thanks,
    Hussein

  • IC Reconciliation AP/AR Open Items (Prozess 003) with SAP ERP

    Dear All
Has anyone already implemented the IC Reconciliation AP/AR Open Items (Prozess 003) with SAP ERP and SAP ByD?
If yes, is there a standard mechanism such as an RFC within SAP ERP? Or how did you set it up?
    Thank you and regards
    Markus

    Hello Lokesh,
    thank you for your reply.
We would like to retrieve data (AR & AP open items) from SAP Business ByDesign and transfer it to our SAP ERP in order to do the intercompany reconciliation.
    Are you aware of any standard functionality to transfer data from SAP ByD to the SAP ERP?
    Thanks & best wishes
    Melanie

  • Import failing with no errors

    Import failing with no errors
    All it says in logs for each table is
    . . skipping table "<table_name>"
It's a full import from a 10.2.0.3 db into a 10.2.0.3 db.

    Hi,
    "INDEXFILE" , this is parameter is specified, index-creation statements for the requested mode are extracted and written to the specified file, rather than used to create indexes in the database. No database objects are imported.
    Due to that that Objects are skipping.
    The INDEXFILE parameter can be used only with the FULL=y, FROMUSER, TOUSER, or TABLES parameters.
    Else you can perform a two step process as stated by Amardeep.
    - Pavan Kumar N
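For illustration, a hedged sketch of the two invocations (credentials and file names are placeholders): the first run only writes the index DDL to a script and imports nothing, the second performs the actual import.

imp system/<password> FULL=y FILE=expdat.dmp INDEXFILE=create_indexes.sql LOG=imp_idx.log
imp system/<password> FULL=y FILE=expdat.dmp LOG=imp_full.log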

  • MDL IMPORT FAILS WITH MDL1261

    Hi all,
I am using OWB Repository version 10.2.0.4.0 on an AIX server and we are facing MDL IMPORT FAILS WITH MDL1261.
On Oracle MetaLink we found the following information. In our repository, however, the DATATYPE of both SAP_FTP and BACKGROUND is set to 113.
    Does the suggestion to set the values to 10007 apply in this case?
    Thanks !
    Sebastian
    MDL IMPORT FAILS WITH MDL1261
The upgrade to 11.1.0.7 creates a new property SAP_FTP in the repository. The DATATYPE of this property is incorrectly set to 113. The value should be 10007. This wrong datatype causes the NullPointerException because it causes the type not to be found when it is expected to be there.
    Solution
    1. Using SQL*Plus, connect to OWBSYS
    2. Check datatype of property SAP_FTP
    SQL> select name, datatype from PROPERTYDEFINITION_V where name = 'SAP_FTP';
    NAME DATATYPE
    SAP_FTP 113
    3. Check datatype of property BACKGROUND
    SQL> select name, datatype from PROPERTYDEFINITION_V where name = 'BACKGROUND';
    NAME DATATYPE
    BACKGROUND 10007
    The value of DATATYPE for the property SAP_FTP and property BACKGROUND should be the same (10007). If this is not the case, the DATATYPE of property BACKGROUND should be changed.
    Proceed to the next step to correct this.
    4. Update PROPERTYDEFINITION_V
    If the value of DATATYPE is not the same for property SAP_FTP and property BACKGROUND, then correct this as follows:
    SQL> update PROPERTYDEFINITION_V
    set DATATYPE = (select datatype from PROPERTYDEFINITION_V where name = 'BACKGROUND') where name = 'SAP_FTP';
    5. Verify updated correctly
    SQL> select name, datatype from PROPERTYDEFINITION_V where name = 'SAP_FTP';
    NAME DATATYPE
    SAP_FTP 10007
    6. If everything looks correct, then commit.
    SQL> commit;
    Commit complete.

    Hi Detlef,
    This is the only error I got. Regards, Sebastian
    error at line 237,069: MDL1261: Error importing MAPPING GG_OPS.4_DWH.GG_MAP_D_OFFICE.
    Detailed Error Message:
    java.lang.NullPointerException
         at oracle.wh.repos.impl.foundation.CMPElement.setElement(CMPElement.java(Compiled Code))
         at oracle.wh.repos.pdl.foundation.WBProxy.uncached(WBProxy.java(Compiled Code))
         at oracle.wh.repos.pdl.foundation.StaticCache.uncache(StaticCache.java:120)
         at oracle.wh.repos.pdl.foundation.CacheMediator.uncacheComponent(CacheMediator.java:1435)
         at oracle.wh.repos.pdl.foundation.UncacheService.uncache(UncacheService.java:495)
         at oracle.wh.repos.pdl.foundation.UncacheService.uncache(UncacheService.java:346)
         at oracle.wh.repos.pdl.foundation.UncacheService.uncache(UncacheService.java:640)
         at oracle.wh.repos.pdl.foundation.MemoryManagerImpl.uncache(MemoryManagerImpl.java:530)
         at oracle.wh.repos.pdl.foundation.MemoryManagerImpl.checkMemory(MemoryManagerImpl.java(Compiled Code))
         at oracle.wh.repos.pdl.foundation.CacheMediator.cache(CacheMediator.java(Compiled Code))
         at oracle.wh.repos.pdl.foundation.CacheMediator.cache(CacheMediator.java(Compiled Code))
         at oracle.wh.repos.impl.ProxyFactoryGen.createCMPMap(ProxyFactoryGen.java:4482)
         at oracle.wh.repos.impl.ProxyFactoryGen.createCMPMap(ProxyFactoryGen.java:4468)
         at oracle.wh.repos.impl.ProxyFactoryGen.createCMPMap(ProxyFactoryGen.java:4463)
         at oracle.wh.repos.pdl.metadataloader.MDLImportCreateUtil.internalImportObject(MDLImportCreateUtil.java(Compiled Code))
         at oracle.wh.repos.pdl.metadataloader.MDLImportCreateUtil.importObject(MDLImportCreateUtil.java(Inlined Compiled Code))
         at oracle.wh.repos.pdl.metadataloader.Import.OutputMDLImport.startElementCreateObj(OutputMDLImport.java(Compiled Code))
         at oracle.wh.repos.pdl.metadataloader.Import.OutputMDLImport.startElement(OutputMDLImport.java(Compiled Code))
         at oracle.wh.repos.pdl.metadataloader.Import.OutputMDLImport.processPendingFCOElements(OutputMDLImport.java(Compiled Code))
         at oracle.wh.repos.pdl.metadataloader.Import.OutputMDLImport.startElementPendingFCOAssoc(OutputMDLImport.java(Compiled Code))
         at oracle.wh.repos.pdl.metadataloader.Import.OutputMDLImport.startElement(OutputMDLImport.java(Compiled Code))
         at oracle.wh.repos.pdl.metadataloader.Import.OutputMDLImport.run(OutputMDLImport.java(Compiled Code))
         at oracle.wh.repos.pdl.metadataloader.converter.foundation.ConverterStateMachine.runAll(ConverterStateMachine.java(Compiled Code))
         at oracle.wh.repos.pdl.metadataloader.converter.foundation.ProcessXML$ConvertHandlerBase.startElement(ProcessXML.java(Compiled Code))
         at oracle.xml.parser.v2.NonValidatingParser.parseElement(NonValidatingParser.java(Compiled Code))
         at oracle.xml.parser.v2.NonValidatingParser.parseRootElement(NonValidatingParser.java:326)
         at oracle.xml.parser.v2.NonValidatingParser.parseDocument(NonValidatingParser.java:293)
         at oracle.xml.parser.v2.XMLParser.parse(XMLParser.java:295)
         at oracle.wh.repos.pdl.metadataloader.converter.util.WBXMLSourceReader.parse(WBXMLSourceReader.java:59)
         at oracle.wh.repos.pdl.metadataloader.converter.foundation.ProcessXML.runConversion(ProcessXML.java:356)
         at oracle.wh.repos.pdl.metadataloader.converter.foundation.ProcessXML.run(ProcessXML.java:290)
         at oracle.wh.repos.pdl.metadataloader.converter.foundation.StateMachine.runAll(StateMachine.java:50)
         at oracle.wh.repos.pdl.metadataloader.converter.WBConverter.ConvertXML(WBConverter.java:568)
         at oracle.wh.repos.pdl.metadataloader.Import.MDLImport.process(MDLImport.java:1859)
         at oracle.wh.repos.pdl.metadataloader.Import.MDLRunImport.internalRunImport(MDLRunImport.java:431)
         at oracle.wh.repos.pdl.metadataloader.Import.MDLRunImport.runImport(MDLRunImport.java:503)
         at oracle.owb.Import.ImportServiceManager.internalImportMetaData(ImportServiceManager.java:506)
         at oracle.owb.Import.ImportServiceManager.importMetaDataFromFile(ImportServiceManager.java:355)
         at oracle.owb.scripting.executers.ImportCmdExecuter.startMetaDataImportCommand(ImportCmdExecuter.java:708)
         at oracle.owb.scripting.executers.ImportCmdExecuter.importFromFile(ImportCmdExecuter.java:150)
         at oracle.owb.scripting.parsers.ImportCmdParser.mdlImportCommand(ImportCmdParser.java:302)
         at oracle.owb.scripting.parsers.ImportCmdParser.ImportCommand(ImportCmdParser.java:152)
         at oracle.owb.scripting.parsers.ImportCmdParser.parseCommand(ImportCmdParser.java:112)
         at oracle.owb.scripting.commands.OMBMetaDataImportCmd.executeCommand(OMBMetaDataImportCmd.java:77)
         at oracle.owb.scripting.commands.OMBCommand.cmdProc(OMBCommand.java:69)
         at tcl.lang.Parser.evalObjv(Parser.java:818)
         at tcl.lang.Parser.eval2(Parser.java:1221)
         at tcl.lang.Interp.eval(Interp.java:2189)
         at tcl.lang.Interp.evalFile(Interp.java:2368)
         at tcl.lang.TclShell.run(TclShell.java:124)
         at tcl.lang.TclShell.run(TclShell.java:68)
         at oracle.owb.scripting.OMBShell.main(OMBShell.java:38)
         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:85)
         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:58)
         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:60)
         at java.lang.reflect.Method.invoke(Method.java:391)
         at Launcher.main(Launcher.java:167)

  • Journal Import fails with EC12 issue

Journal Import fails with an EC12 issue. We know the AR credit memo transactions causing this issue have distribution lines with an accounted_dr value of zero while entered_dr is non-zero; the same is the case with accounted_cr and entered_cr.
The issue is with the credit memos generated out of iReceivables, which create distributions with null amounts when applied to invoices with rules. The null amounts are on REV/UNEARN distribution lines. We applied patch# 12957348 as suggested on MetaLink, and we noticed even more such transactions after the patch application.
Any pointers are really appreciated. We have over 5,000 such lines and it is not possible to clear them manually, and since these CMs are phased we are sure to see this issue again until it is resolved. Note# 285995.1 didn't help either.
    We are on 11.5.10.2 and 10g database.
    K
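Before resubmitting, a minimal diagnostic sketch for spotting interface lines whose accounted amount is zero while the entered amount is not (column names as used elsewhere in this thread; add your own source and status filters as needed):

SELECT user_je_source_name, group_id, status,
       entered_dr, accounted_dr, entered_cr, accounted_cr
  FROM apps.gl_interface
 WHERE (NVL(accounted_dr, 0) = 0 AND NVL(entered_dr, 0) <> 0)
    OR (NVL(accounted_cr, 0) = 0 AND NVL(entered_cr, 0) <> 0);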

-- Fragment from a conversion procedure: vl_interface_id, vl_request_id and retcode
-- are assumed to be declared in the enclosing procedure.
FOR l_rec IN (SELECT ledger_id, group_id
                FROM apps.gl_interface
               WHERE status = 'NEW'
                 AND user_je_source_name = 'GIS_DATA_CONVERSION'
               GROUP BY ledger_id, group_id
               ORDER BY group_id)
LOOP
   -- Mark the interface rows for this ledger/group so Journal Import picks them up
   apps.gl_journal_import_pkg.populate_interface_control (
      user_je_source_name   => 'GIS_DATA_CONVERSION',
      group_id              => l_rec.group_id,
      set_of_books_id       => l_rec.ledger_id,
      interface_run_id      => vl_interface_id,
      table_name            => 'GL_INTERFACE',
      processed_data_action => 'D');
   COMMIT;

   -- Submit the Journal Import concurrent program for this group
   vl_request_id := apps.fnd_request.submit_request (
      application => 'SQLGL',                 -- application short name
      program     => 'GLLEZLSRS',             -- program short name
      description => NULL,                    -- program description
      start_time  => NULL,                    -- start date
      sub_request => FALSE,                   -- sub-request
      argument1   => 2065,                    -- data access set id
      argument2   => 'GIS_DATA_CONVERSION',   -- source
      argument3   => l_rec.ledger_id,         -- set of books id
      argument4   => l_rec.group_id,          -- group id
      argument5   => 'N',                     -- post errors to suspense flag
      argument6   => NULL,                    -- create summary flag
      argument7   => 'N');                    -- import descriptive flexfield flag
   COMMIT;

   IF (vl_request_id = 0) THEN
      xxgis.gis_conv_util_pkg.debug_print_p(1, 'FND_LOG',
         'E001: Journal Import Submission Failed. ' || SQLERRM);
      retcode := 2;
      EXIT;
   ELSE
      xxgis.gis_conv_util_pkg.debug_print_p(1, 'FND_LOG',
         'P001: Submitted Journal Import Program for group id: ' || l_rec.group_id ||
         ' and ledger: ' || l_rec.ledger_id || ', Request ID: ' || vl_request_id);
   END IF;
END LOOP;

  • Item Import - completes with " Completed Error" status and process_flag=4

    I am trying to Import Item through Open Interface.
    RDBMS : 9.2.0.5.0
    Oracle Applications : 11.5.10
    Populated data into mtl_system_items_interface with process_flag = 1.
Once I run the "Item Import" program, it completes in error with the following in the log file.
The records are updated with process_flag = 4, but there are no error details in mtl_interface_errors for the corresponding records in the interface table.
    How do I identify the error ?
    +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
    Inventory: Version : 11.5.0 - Development
    Copyright (c) 1979, 1999, Oracle Corporation. All rights reserved.
    INCOIN module: Import Items
    Current system time is 07-SEP-2010 11:39:11
    Page Length = 93, Page Width = 132
    ===================================================================
    Debug Mode : Enabled
    Output to Terminal : No
    Argument Method : Database Fetch
    Trace Mode : Disabled
    ===================================================================
    Argument 1 (ORG_ID) = 174
    Argument 2 (ALL_ORG) = 1
    Argument 3 (VAL_ITEM_FLAG) = 1
    Argument 4 (PRO_ITEM_FLAG) = 1
    Argument 5 (DEL_REC_FLAG) = 1
    Argument 6 (PROCESS_SET) = 11
    Argument 7 (CREATE_UPDATE) = 1
    ===================================================================
    Item Catalog Group Descriptive Elements Open Interface import completed successfully for all records in this record set.
    Item Catalog Group Descriptive Elements Open Interface import completed successfully for all records in this record set.
    Start of log messages from FND_FILE
    End of log messages from FND_FILE
    +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

Found a few packages related to item revisions that were in Invalid status.
Compiled them all, and now the items are getting imported without any issue.
    Thanks Sandeep for your word.
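For anyone hitting the same symptom, a minimal sketch of listing invalid packages before re-running Item Import (filter further by owner or name if desired):

SELECT owner, object_name, object_type
  FROM dba_objects
 WHERE status = 'INVALID'
   AND object_type IN ('PACKAGE', 'PACKAGE BODY');

Invalid objects can then be recompiled individually (ALTER PACKAGE <owner>.<name> COMPILE BODY;) or in bulk by running $ORACLE_HOME/rdbms/admin/utlrp.sql as SYS.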

  • Disco command line import fails with parser error 7

Hi, we have Discoverer version 10.1.2.48.18 installed in multiple environments.
Developers have created an .eex file and we try to import it into all the test environments; with some it goes through fine, but with others it fails with parser error 7:
    Here is the full error log:
    11/13/2008 7:30:22 AM
    D:\disco1012\BIToolsHome_1\bin\DIS51ADM.EXE /connect /import oudata_disco.eex /identifier /refresh /preserve_workbook_owner /show_progress /log import.log
    oudata_disco.eex:Could not locate the Folder with identifier 'INSTRUCTOR_ASSIGNMENTS1' in the target End User Layer
    oudata_disco.eex:An imported Folder had display name 'Dc transactional sales q2' renamed to 'Dc transactional sales q2 1'
    oudata_disco.eex:An imported Item Class had display name 'Eval_Delivery_Region' renamed to 'Eval_Delivery_Region 1'
    oudata_disco.eex:An imported Item Class had display name 'Quarter' renamed to 'Quarter 1'
    oudata_disco.eex:An imported Item Class had display name 'Track List with ALL' renamed to 'Track List with ALL 1'
    The Item Hierarchy with identifier 'INSTRUCTOR_ASSIGNMENTS_LAST_REFRESH_DATE_DEFAULT_DATE_HIERARCHY' has not been imported because: There are no items in this hierarchy node
    Import completed, but with warnings. Please check the result.
    File(s) imported partially :
    oudata_disco.eex
    11/13/2008 7:36:38 AM
    11/13/2008 7:37:29 AM
    D:\disco1012\BIToolsHome_1\bin\DIS51ADM.EXE /connect /import oudata_disco_workbooks_84.eex /identifier /refresh /preserve_workbook_owner /show_progress /log import.log
    oudata_disco_workbooks_84.eex:A parsing failure has occurred in file 'oudata_disco_workbooks_84.eex'.
    Parser error: '7'
    The import has failed (your End User Layer has not been modified) - oudata_disco_workbooks_84.eex:A parsing failure has occurred in file 'oudata_disco_workbooks_84.eex'.
    Parser error: '7'
If anyone knows what the cause of this could be, please let me know.
    thanks,
    Nina

    Hi,
The most likely cause is that the export file has become corrupted. Are the export files very large?
    Another possibility is that Discoverer Administrator has run out of memory while parsing the file.
    Rod West

  • Unable to clear open items through post with Clearing F-04

    Hi Experts,
I want to clear one GL code through transaction F-04 Post with Clearing.
When I clear the open items through F-04 (document types UA, KR, MJ), it posts the document but then creates a new line item as an open item with the same amount. The new line item shows in the list as an open item. (I tried F-03 as well.)
I am unable to understand the exact functionality of transaction F-04.
Can anyone explain it, please? Please advise if I am missing anything here.
    Thanks,
    Manasi

    Hi Manasi,
As the name suggests, F-04 allows you to POST to one account by CLEARING another account.
For example, you have a provision account that needs to be adjusted when the actual expense is paid. This is how it is done:
    Give the document header details, select "Transfer posting with clearing", enter the posting key (40), give the GL account (expense account as per the example) and go to the next screen.
    Give the amount and other details for the line items and then select "choose open item" from the menu. The new screen will ask you to give an account where you will give the provision account (the account from which the open item needs to be cleared).
    Press enter and you will get the list of open items for the GL account from where you can proceed like any other clearing.
    Hope this clarifies. Do revert in case of any further queries.
    Thanks and Regards,
    anit

  • Creating Open Directory Replica fails with Server Admin Error Value 1127

    Hallo,
    I have seen a lot of similar threads here and they were helpful up to a certain point, but in the end, they did not solve my problem.
Currently, it comes down to this. The Server Admin error message is really meaningless and I could not find a single hit for the error value anywhere on the web. So I switched to the command-line versions of the tools involved to get more meaningful results. It worked. Specifically, creating a replica of an OpenLDAP master means using slapconfig.
    When executing
    slapconfig -createreplica master.ourdomain.com diradmin
    as root on the prospective replica machine, I get the following error message:
    ssh command failed with status 127
    That command is not allowed with the root account via public key authentication.
    That makes perfect sense to me, but how is it meant to work then?
    Executing slapconfig as admin tells me that this tool is to be executed as root. On the other hand, root login via ssh is not allowed in Mac OS X by default, which seems fine to me. I even changed /etc/sshd_config on the Open Directory Master machine to "PermitRootLogin yes". However, neither reloading ssh using launchctl nor restarting the whole server made this setting operational. Trying to login from command line as root still tells me:
    root login is not permitted to this machine via public key authentication.
While this is the current state where I need help urgently, I changed some other things before that. I mention them here to exclude these issues as possible reasons for the failure. I got this message for quite a while:
    Replica Setup failed : This machine does not have a valid computer name
I was sure "this machine" meant the target machine, the Open Directory master, because the domain had changed there once before I took over responsibility as an admin in this environment. And in fact, changeip did reveal an issue there. The command proposed by changeip to fix the situation did not seem appropriate, because this machine is multihomed with a public and a private IP address. Proper name resolution is available for both interfaces, including reverse lookup. I don't like this setup, but it was the only way to get the mail service running smoothly. Running changeip on the machine itself with these arguments
changeip /LDAPv3/127.0.0.1 internalIP internalIP old.ours.com current.ours.com
reported success in updating the password server, Open Directory, both interfaces, hostconfig (which in fact did not change) and Samba. It reported an issue with kadmin, which is related to Kerberos (we don't use Kerberos yet).
Changing the hostname of the server using changeip did not solve the issue. I then found the hint to check with scutil. This showed that the hostname was not set on the prospective replica machine. (A question aside: in how many places is the hostname stored? The traditional /etc/hostname is gone, but it seems to have been replaced with several other configuration files and databases. I can't see this as an advantage.) Setting the hostname using scutil worked fine. However, it did not solve the problem either. At least slapconfig now started complaining about not being able to log in as root instead of failing from the start.
I also checked all log files on both machines that might have to do with OpenLDAP, namely /var/log/slapd.log, /var/log/system.log and /Library/Log/slapconfig.log. I also checked the log of the layer on top of OpenLDAP, which is /Library/Log/DirectoryService.server.log. None of them revealed anything noticeable besides a lot of entries that I have googled in the last few hours, none of which seem to be associated with the problem in question.
I will take a break now, but I have to fix this by tomorrow, and I hope to get the ultimate hint from you, dear reader.
    Thanks and bye, Christian Völker

    ssh command failed with status 127
    That command is not allowed with the root account via public key authentication.
    Initial OD replication takes place via 'ssh'. If you have 'sshd' configured on the OD Master to authenticate with public keys then the OD replica will not be able to communicate with the OD Master via 'ssh'. You must configure the OD Master to use 'ssh' with password authentication and root login enabled.
    Demote the replica back to standalone. Stop any services that you may have running on the primary network interface. Then stop any services that you may have running on the secondary network interface. In the 'Network' System Prefpane remove the IP number from the secondary interface then deactivate the secondary network interface.
    Assign the private IP address and hostname that you wish to use for the replica to the primary network interface. Assign the 'public' IP number to the secondary interface. Check the DNS to see that the IP address and hostname for the primary network interface resolve both forward and reverse for the hostname of the replica that you have chosen. If it does not, fix your DNS before proceeding.
    In the 'Sharing' System Prefpane, change the name of the machine to the hostname (server.domain.tld) of the replica that you have chosen. Then use 'changeip -checkhostname' to see if the IP/hostname matches. Fix it if it doesn't.
    Then configure the /etc/sshd_config file on the OD master like this:
# Authentication:
    PermitRootLogin yes
    PasswordAuthentication yes
    PubkeyAuthentication no
    and the /etc/ssh_config file on the OD replica like this:
    PasswordAuthentication yes
    PubkeyAuthentication no
    Then from the OD replica as the 'root' user issue:
    slapconfig -createreplica <ODMasterIPorFQDN> <diradmin user>
    Make sure that the 'diradmin' user's password contains only alpha-numeric characters -no 'option-characters' or symbols, change it first if it does. Once the process completes, reactivate the secondary interface for the 'public' IP and check the configuration of services that will be using that IP, then start your other services. Secure the 'ssh' service on both machines to disable password authentication and 'root' logins.

  • Open Item interest calculation with different exchange rate for each line

    Dear All
We have a short-term loan facility with our bank. We take short-term loans with the bank on different terms and conditions. Now the user has a requirement to have a different interest rate for each open item, depending on the posting date or other conditions. I think that in SAP, for one bank account, only one rate is applicable per valid-from date.
    Can someone share his/her experience if got the same requirement from user
    Thanks and Best Regards
    Farhan Qaiser

    Dear Ashwin,
As far as overdue days are concerned, there is no standard functionality available, whereas there is for amounts.
A better solution is to have an ABAPer create your own function module. That will help you in a better way.

  • Import failed with canon 6d footage after latest update

After I updated to Final Cut X 10.0.8, I've been getting "Import Failed" messages during the import process when taking in footage from my Canon 6D.
The import process also seems to be going very slowly (slower than usual).
I've tried duplicating the SD card locally on the hard drive and re-importing, but have gotten the same results.
If I click OK on the error dialog box, the import will continue, but another box will pop up every 5 to 10 minutes or so.
    Any ideas?

Hey Russ, it actually works fine for me too the few times I've used it, but I point it out as a common denominator with people who have trouble importing that way. Separating the .mov files into their own folder eliminates the camera and card reader, both much slower than just transferring files from a connected drive. But I'm doing newspaper video with Canon DSLRs - including 5D3 video, similar to 6D - almost daily, where speed is of the essence, and this is by far the most foolproof method I've come across.

  • Import fails with unable to extend table CUSTOM.CASA_TRAN_HIST_UPLD by 6999

    Hi,
I have taken an export backup of a table from 9.2.0.4 on AIX and am trying to import it into 11.1.0.7.0 on AIX.
While importing I am getting the following error:
ORA-01653: unable to extend table CUSTOM.CASA_TRAN_HIST_UPLD by 699912 in tablespace CUSTOM
The table size is 37G and the total free space in the tablespace is 40G,
and there is no index on the table.
Following are some lines from the import log:
    Connected to: Oracle Database 11g Enterprise Edition Release 11.1.0.7.0 - 64bit Production
    With the Partitioning, OLAP, Data Mining and Real Application Testing options
    Export file created by EXPORT:V09.02.00 via direct path
    import done in US7ASCII character set and UTF8 NCHAR character set
    import server uses AL32UTF8 character set (possible charset conversion)
    export server uses AL16UTF16 NCHAR character set (possible ncharset conversion)
    . importing DATAMIG's objects into CUSTOM
    . . importing table "CASA_TRAN_HIST_UPLD"
    IMP-00058: ORACLE error 1653 encountered
    ORA-01653: unable to extend table CUSTOM.CASA_TRAN_HIST_UPLD by 699912 in tablespace CUSTOM
    IMP-00028: partial import of previous table rolled back: 62844421 rows rolled back
    IMP-00017: following statement failed with ORACLE error 1917:
    "GRANT SELECT ON "CASA_TRAN_HIST_UPLD" TO "BSGUSER""
    IMP-00003: ORACLE error 1917 encountered
    ORA-01917: user or role 'BSGUSER' does not exist
    Import terminated successfully with warnings.
Is there any way to resolve the issue?
How can I change the NCHAR character set for the import?
    Thanks
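Regarding the ORA-01653 itself, a minimal sketch (the datafile path and size are placeholders) of checking the free space in the CUSTOM tablespace and giving it room to grow:

SELECT tablespace_name,
       ROUND(SUM(bytes)/1024/1024/1024, 1) AS free_gb,
       ROUND(MAX(bytes)/1024/1024/1024, 1) AS largest_free_extent_gb
  FROM dba_free_space
 WHERE tablespace_name = 'CUSTOM'
 GROUP BY tablespace_name;

ALTER TABLESPACE custom ADD DATAFILE '<path>/custom_new01.dbf' SIZE 10G AUTOEXTEND ON;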

    Hello,
which & how i can set character set for import.
About the character set: it is a setting made at database creation. You can check it by running the following query on the source and target databases:
select * from v$nls_parameters;
The NLS_CHARACTERSET value will give you the character set of the database.
It cannot be changed easily. It may require a database re-creation and an export/import of the data (see Note 225912.1).
    Else, when you export (with the Original Export/Import utility) it's recommended to set the NLS_LANG parameter.
    The NLS_LANG parameter has 3 components:
    - Language
    - Territory
    - Client Character Set
A wrong setting of NLS_LANG may lead to conversion. However, starting with 9i, most data is exported with the character set of the database regardless of the NLS_LANG setting. The following note may give you some details about it:
Export/Import and NLS Considerations [ID 15095.1]
Hope this helps.
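For reference, NLS_LANG is specified as <language>_<territory>.<client character set>; a purely illustrative example (pick values that match your source environment) is:

NLS_LANG=AMERICAN_AMERICA.US7ASCII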
    Best regards,
    Jean-Valentin
