Journal Import Attributes not processed
Hi,
I'm loading data into an interface table to create journal lines using the standard Journal Import program, and I am passing values in the attribute fields of my interface table. The attributes are for the table GL_JE_LINES.
After Journal Import runs, my journal lines are created, but no information appears in the attribute fields.
I have passed the Context value as well.
I can't find why the attribute values are not picked up by Journal Import.
Please help
Vik
Hello,
Yes, it was argument7 when submitting the request via fnd_request.submit_request; the value should be either 'O' (import descriptive flexfields without validation) or 'W' (with validation).
, p_argument1 => l_interface_run_id
, p_argument2 => TO_CHAR(fnd_profile.value('GL_ACCESS_SET_ID'))
, p_argument3 => 'N'
, p_argument4 => NULL
, p_argument5 => NULL
, p_argument6 => 'N'
, p_argument7 => 'O' -- import descriptive flexfields without validation
, p_argument8 => 'Y'
Thx
vik
Similar Messages
-
Hi All,
I am importing a Sub Agreement in XML format and getting the error below:
XML import could not process field: CONTRACT_UNIQUE_DOC_NAME
unsupported null bind parameter no. 0
Below is the XML.
<?xml version="1.0" encoding="UTF-8"?>
<sapesourcing xsi:noNamespaceSchemaLocation="Contracts.xsd" defaultlanguage="" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
<objects>
<object classname="contracts.Agreement">
<fields>
<CONTRACT_UNIQUE_DOC_NAME>P-XXXX-00001</CONTRACT_UNIQUE_DOC_NAME>
<DISPLAY_NAME>Testing Agreement Name 1 </DISPLAY_NAME>
<CALC_METHOD>UNIT_PRICE_ENTERED</CALC_METHOD>
<UNIQUE_DOC_NAME>S0002510</UNIQUE_DOC_NAME>
<STATUS>In Process</STATUS>
<COMPANY>TEST</COMPANY>
<LOCATION>Location_114</LOCATION>
</fields>
</object>
</objects>
</sapesourcing>
Has anyone been successful in importing a Sub Agreement in XML format? Could you also please validate the above XML structure?
Note: I am able to load Sub Agreements using CSV, but in our scenario the data comes in XML format.
Thanks & Regards
Sai
Hi Gary,
I am able to load below objects in XML format without any issues. And I am using Scheduled Task of type “Data Import Monitor” to load them.
1) Project (1100)
2) Master Agreement (1004)
3) Extension Collections for Project
4) Extension Collections for Master Agreement
1) Working XML Structure for Project (1100):
<?xml version="1.0" encoding="UTF-8"?>
<sapesourcing xsi:noNamespaceSchemaLocation="Projects.xsd" defaultlanguage="" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
<objects>
<object classname="projects.projects">
<fields>
<UNIQUE_DOC_NAME>S0004392</UNIQUE_DOC_NAME>
<DISPLAY_NAME>Test Project</DISPLAY_NAME>
2) Working XML Structure for Master Agreement (1004) object.
<?xml version="1.0" encoding="UTF-8"?>
<sapesourcing xsi:noNamespaceSchemaLocation="Contracts.xsd" defaultlanguage="" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
<objects>
<object classname="contracts.Contract">
<fields>
<UNIQUE_DOC_NAME>00215944</UNIQUE_DOC_NAME>
<DISPLAY_NAME>TEST MA</DISPLAY_NAME>
<DOCUMENT_DESCRIPTION>Contract Desc</DOCUMENT_DESCRIPTION>
3) Working XML Structure for Extension Collections for Project(1100).
<?xml version="1.0" encoding="UTF-8"?>
<sapesourcing xsi:noNamespaceSchemaLocation="Projects.xsd" defaultlanguage="" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
<objects>
<object classname="projects.projects">
<fields>
<UNIQUE_DOC_NAME>S0004392</UNIQUE_DOC_NAME>
</fields>
<collections>
<Z_DETAILS replace="true">
<object classname="extension_collections">
<fields>
<NAME>Sai</NAME>
<EMAIL>[email protected]</EMAIL>
</fields>
</object>
<object classname="extension_collections">
<fields>
<NAME>Krishna</NAME>
<EMAIL>[email protected]</EMAIL>
</fields>
</object>
</Z_DETAILS>
</collections>
</object>
</objects>
</sapesourcing>
4) Working XML Structure for Extension Collections for Master Agreement (1004).
<?xml version="1.0" encoding="UTF-8"?>
<sapesourcing xsi:noNamespaceSchemaLocation="Contracts.xsd" defaultlanguage="" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
<objects>
<object classname="contracts.Contract">
<fields>
<UNIQUE_DOC_NAME>00215944</UNIQUE_DOC_NAME>
</fields>
<collections>
<ZLINKS replace="true">
<object classname="extension_collections">
<fields>
<LINK_NAME>SAP</LINK_NAME>
<LINK_URL>http://www.sap.com/index.html</LINK_URL>
</fields>
</object>
<object classname="extension_collections">
<fields>
<LINK_NAME>SCN Sourcing</LINK_NAME>
<LINK_URL>http://scn.sap.com/community/sourcing</LINK_URL>
</fields>
</object>
</ZLINKS>
</collections>
</object>
</objects>
</sapesourcing>
Could you please help me with the right XML structure for Agreement (1003)?
Regards,
Sai -
Auto Invoice Import Program not processing records
Hi,
I wrote a back-end procedure to submit the AutoInvoice import program. I find the concurrent program is submitted, but none of the records are processed.
Please find the code given below for reference.
APPS Version : 11.5.10.2
DECLARE
   l_request NUMBER;
BEGIN
   fnd_global.apps_initialize(2709, 50325, 222); -- user_id, resp_id, resp_appl_id
   l_request := FND_REQUEST.SUBMIT_REQUEST
   (application => 'AR',
   program => 'RAXTRX',
   description => 'Auto',
   start_time => NULL, -- to start immediately
   sub_request => FALSE,
   argument1 => 'MAIN',
   argument2 => 'T',
   argument3 => '24', -- batch_source_id
   argument4 => 'AR Batch Source', -- batch_source_name
   argument5 => TO_CHAR(SYSDATE, 'RRRR/MM/DD HH24:MI:SS'), -- default date; note HH24:MI:SS, not HH:MM:SS (MM is month)
   argument6 => '',
   argument7 => '',
   argument8 => '',
   argument9 => '',
   argument10 => '',
   argument11 => '',
   argument12 => '',
   argument13 => '',
   argument14 => '',
   argument15 => '',
   argument16 => '',
   argument17 => '',
   argument18 => '',
   argument19 => '',
   argument20 => '',
   argument21 => '',
   argument22 => '',
   argument23 => '',
   argument24 => '',
   argument25 => '',
   argument26 => 'Y',
   argument27 => 'Y',
   argument28 => '',
   argument29 => '155', -- org_id
   argument30 => CHR(0) -- CHR(0) marks the end of the parameter list
   );
   COMMIT;
END;
Please post the log of the concurrent request. Please see if MOS Doc 1089172.1 (Troubleshooting Autoinvoice Import - Execution Report Errors (Request Status = Completed)) can help.
HTH
Srini -
Hi All,
Environment :
- Solaris 11
- EBS R12
- Oracle DB 11.2.0.3.0
The Journal Import profile options are still at their defaults (not yet tuned).
Our Journal Import cannot process more than 4000 records.
When we try to import more than 4000 records, a "SQL*Net message to client" error appears in the OAM view log, and when I look in LAB128 and OEM, the process stops mostly on gl.gl_je_lines, sometimes on gl.gl_je_headers.
I have already followed the suggestions from the Metalink forum:
- changed sqlnet.ora and listener.ora on the DB server
- added sqlnet.ora on the Apps server (EBS R12 server)
But now there is another issue: the SQL*Net message to client error is no longer raised, yet when I look at the Journal Import OAM log, LAB128, and OEM, the session is inactive and the process appears to loop forever.
I need a suggested solution for this issue.
Any help is very appreciated. Thanks in advance.
Best Regards,
Yohanes Hany Haryadi Widodo
Edited by: SigCle on May 7, 2013 2:11 AM
Dear Mr. Hussein,
Our EBS version is 12.1.3, while
R12: Journal Import Failing With ORA-24337 Error When Importing All GROUP IDs [ID 1159594.1] applies to Oracle General Ledger Version 12.0.0 to 12.1.2 (Release 12.0 to 12.1).
I have already changed the settings as follows:
Please revert to Oracle Net Server tracing/logging, set following parameter in the server's sqlnet.ora :
DIAG_ADR_ENABLED = OFF
- to back out the ADR diag for the Listener component, set following parameter in the server's listener.ora:
DIAG_ADR_ENABLED_<listenername> = OFF
- Where the <listenername> would be replaced with the actual name of the configured listener(s) in the listener.ora configuration file. For example, if the listener name is 'LISTENER', the parameter would read:
DIAG_ADR_ENABLED_LISTENER = OFF
-Reload or restart the TNS Listener for the parameter change to take effect.
ACTION PLAN
============
We will need to trace a connection on both CLIENT and SERVER endpoints to see what is happening. Please follow these steps:
1. Add the following parameters in the sqlnet.ora file on the CLIENT workstation (where sql loader is executed):
TRACE_LEVEL_CLIENT=16
TRACE_DIRECTORY_CLIENT=<some_known_directory>
TRACE_FILE_CLIENT=client
TRACE_UNIQUE_CLIENT=ON
TRACE_TIMESTAMP_CLIENT=ON
DIAG_ADR_ENABLED =OFF -- add this in case of 11g client
If you need to restrict the amount of disk space used by the long-term traces then you can also set the following:
TRACE_FILELEN_CLIENT=<file_size_in_Kbytes>
TRACE_FILENO_CLIENT=<number_of_files>
2. Add the following parameters in the sqlnet.ora file on the SERVER:
TRACE_LEVEL_SERVER=16
TRACE_DIRECTORY_SERVER=<some_known_directory>
TRACE_FILE_SERVER=server
TRACE_TIMESTAMP_SERVER=ON
DIAG_ADR_ENABLED =OFF
If you need to restrict the amount of disk space used by the long-term traces then you can also set the following:
TRACE_FILELEN_SERVER=<file_size_in_Kbytes>
TRACE_FILENO_SERVER=<number_of_files>
3. Try to reproduce the issue.
4. Check if trace files were created.
5. Disable tracing by removing the TRACE lines from sqlnet.ora on both CLIENT and SERVER.
6. Compress (in .zip or .tar.gz format) and upload the trace files.
We only need a pair of client and server trace files for the same sqlplus session which exhibits the issue; in order to match client and server trace files you should use the tips in Note:374116.1 "How to Match Oracle Net Client and Server Trace Files".
Below is the result of the failing Journal Import:
+---------------------------------------------------------------------------+
General Ledger: Version : 12.0.0
Copyright (c) 1979, 1999, Oracle Corporation. All rights reserved.
GLLEZL module: Journal Import
+---------------------------------------------------------------------------+
Current system time is 07-MAY-2013 19:43:26
+---------------------------------------------------------------------------+
gllsys() 07-MAY-2013 19:43:26
fnd_user_id = 1164
fnd_user_name = CN_FAH_MANAGER
fnd_login_id = 134937
con_request_id = 491859
sus_on = 0
from_date =
to_date =
create_summary = 1
archive = 0
num_rec = 25000
run_id = 6415
<< gllsys() 07-MAY-2013 19:43:26
SHRD0108: Retrieved 202 records from fnd_currencies
gllcnt() 07-MAY-2013 19:43:26SHRD0118: Updated 1 record(s) in table: gl_interface_control
source name = CN FAH Credit Card
interface source name = CN FAH Credit Card
group id = 17232
ledger_id = -1
LEZL0001: Found 1 sources to process.
glluch() 07-MAY-2013 19:43:26
<< glluch() 07-MAY-2013 19:43:26
gl_import_hook_pkg.pre_module_hook() 07-MAY-2013 19:43:26
<< gl_import_hook_pkg.pre_module_hook() 07-MAY-2013 19:43:26
glusbe() 07-MAY-2013 19:43:26
<< glusbe() 07-MAY-2013 19:43:26
<< gllcnt() 07-MAY-2013 19:43:26
gllacc() 07-MAY-2013 19:43:26
<< gllacc() 07-MAY-2013 19:43:26
gllenc() 07-MAY-2013 19:43:26SHRD0108: Retrieved 6 records from gl_encumbrance_types
<< gllenc() 07-MAY-2013 19:43:26
gllfss() 07-MAY-2013 19:43:26LEZL0005: Successfully finished building dynamic SQL statement.
<< gllfss() 07-MAY-2013 19:43:26
gllcje() 07-MAY-2013 19:43:26
gllalb() 07-MAY-2013 19:43:26
<< gllalb() 07-MAY-2013 19:43:26
glllgr() 07-MAY-2013 19:43:34
gllpst() 07-MAY-2013 19:43:34SHRD0108: Retrieved 45 records from gl_period_statuses
<< gllpst() 07-MAY-2013 19:43:34
gllbud() 07-MAY-2013 19:43:34
<< gllbud() 07-MAY-2013 19:43:34
currency = IDR
sus_flag = N
ic_flag = Y
bc_flag = N
latest_opened_encumbrance_year = 2011
<< glllgr() 07-MAY-2013 19:43:34
SHRD0108: Retrieved 200 records from gl_je_categories
<< gllged() 07-MAY-2013 19:43:34
<< gllged() 07-MAY-2013 19:43:34
<< gllged() 07-MAY-2013 19:43:34
<< gllged() 07-MAY-2013 19:43:34
<< gllged() 07-MAY-2013 19:43:34
<< gllged() 07-MAY-2013 19:43:34
<< gllged() 07-MAY-2013 19:43:34
<< gllged() 07-MAY-2013 19:43:34
<< gllged() 07-MAY-2013 19:43:34
<< gllged() 07-MAY-2013 19:43:34
<< gllged() 07-MAY-2013 19:43:34
<< gllged() 07-MAY-2013 19:43:34
<< gllged() 07-MAY-2013 19:43:34
<< gllged() 07-MAY-2013 19:43:34
<< gllged() 07-MAY-2013 19:43:34
<x gllcje() 07-MAY-2013 19:49:18
Error in: gllcje
Function return status: 0
Function Err Message: Executing upd_prep using descriptor updbindda
Function warning number: -1
sqlcaid: sqlabc: 0 sqlcode: -3113 sqlerrml: 48
sqlerrmc:
ORA-03113: end-of-file on communication channel
sqlerrp: sqlerrd: 0 1 0 0 0 538976288
sqlwarn: sqltext:
*****************************************************SHRD0044: Process logging off database and exiting ...
+---------------------------------------------------------------------------+
Start of log messages from FND_FILE
+---------------------------------------------------------------------------+
+---------------------------------------------------------------------------+
End of log messages from FND_FILE
+---------------------------------------------------------------------------+
ORACLE error 3114 in AFPRSR-Resubmit_Time
Cause: AFPRSR-Resubmit_Time failed due to ORA-03114: not connected to ORACLE
The SQL statement being executed at the time of the error was: and was executed from the file .
Routine FDPCLS encountered an error changing request 491859 status.
Contact your support representative.
ORACLE error 3114 in close_server_files
Cause: close_server_files failed due to ORA-03114: not connected to ORACLE.
The SQL statement being executed at the time of the error was: &SQLSTMT and was executed from the file &ERRFILE.
ORACLE error 3114 in fetch_lines
Cause: fetch_lines failed due to ORA-03114: not connected to ORACLE.
The SQL statement being executed at the time of the error was: &SQLSTMT and was executed from the file &ERRFILE.
ORACLE error 3114 in open_server_files
Cause: open_server_files failed due to ORA-03114: not connected to ORACLE.
The SQL statement being executed at the time of the error was: &SQLSTMT and was executed from the file &ERRFILE.
ORACLE error 3114 in close_user_handles
Cause: close_user_handles failed due to ORA-03114: not connected to ORACLE.
The SQL statement being executed at the time of the error was: &SQLSTMT and was executed from the file &ERRFILE.
ORACLE error 3114 in FDPCLS
Cause: FDPCLS failed due to ORA-03114: not connected to ORACLE
The SQL statement being executed at the time of the error was: lock TABLE FND_CONCURRENT_REQUESTS IN SHARE UPDATE MODE and was executed from the
/u02/oracle/PFT/apps/apps_st/appl/gl/12.0.0/bin/GLLEZL
Program exited with status 1
+---------------------------------------------------------------------------+
Executing request completion options...
Output file size:
0
Finished executing request completion options.
+---------------------------------------------------------------------------+
Concurrent request completed
Current system time is 07-MAY-2013 19:49:18
+---------------------------------------------------------------------------+
I don't know how ORA-03114: not connected to ORACLE could have happened.
My tentative suspicion is a network issue between the Apps server and the DB server (two-tier), because with AutoBatch for FAH (Financial Accounting Hub) there were 5 Journal Import processes on the request and only this one errored.
My team raised an SR for this issue two days ago; hopefully we will see their answer in a few hours.
Best Regards,
Yohanes
Edited by: SigCle on May 14, 2013 3:05 AM -
Security rule whether be checked when journal import
Hi All,
Thanks for your attention, I have got an issue about security rule.
When I used GL_INTERFACE to import journals into EBS, I expected that security rules would not be checked, only cross-validation rules. But I found that security rules are sometimes checked for some responsibilities during Journal Import. Why does this happen, and is there any profile option or setup to control it?
Thanks for your help.
Best Regards
Spark
Hi Spark,
It looks like Journal Import doesn't check security rules, but it does check cross-validation rules upon dynamic insertion. Sorry for the misleading information earlier. You can check the Metalink note Journal Import - FAQ [ID 107059.1]. Here are the relevant comments from the note:
A04. Does the Journal Import process check for Cross-Validation or Security
Rules violations?
Journal Import does not check security rules. Transactions that come
from Oracle subledgers (AR, AP, etc.) already have the CCID (Code
Combination ID) in the GL_INTERFACE table. These have been validated
in the feeder system.
You can also populate the accounting segments directly into the
gl_interface table and let Journal Import populate the
code_combination_id. If dynamic insertion is enabled, and this is a
new combination, then the import program will check for cross
validation rule violations.
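The distinction described in A04 can be checked directly in the interface table; here is a hedged diagnostic sketch, assuming the standard GL_INTERFACE columns:

```sql
-- Rows carrying a CODE_COMBINATION_ID were validated in the feeder
-- system; rows carrying only segments go through dynamic insertion
-- and may hit cross-validation rules.
SELECT CASE
         WHEN code_combination_id IS NOT NULL THEN 'CCID supplied'
         ELSE 'Segments only (dynamic insertion)'
       END AS validation_path,
       COUNT(*) AS row_count
  FROM gl_interface
 GROUP BY CASE
            WHEN code_combination_id IS NOT NULL THEN 'CCID supplied'
            ELSE 'Segments only (dynamic insertion)'
          END;
```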
Thanks,
Kiran -
Hi
Please advise on the below requirement:
Business users upload journals using Web ADI, and one unbalanced journal was also uploaded. Now the Journal Import program errors out because of the unbalanced journal entry, and the correct journals are also stuck in the interface.
We delete the unbalanced record from GL_INTERFACE, and Journal Import then works fine.
Please suggest a way for Journal Import not to fail because of one wrong journal entry, so that the other correct entries are imported successfully.
Best Regards,
AMIT
Hi Amit,
Please suggest a way for Journal Import not to fail because of one wrong journal entry, so that the other correct entries are imported successfully.
I am not quite sure whether this is recommended and achievable; hopefully other experts will have valuable input.
We delete the unbalanced record from GL_INTERFACE, and Journal Import then works fine.
To clear incompletely imported journal entries, please try running the program "Delete Journal Import Data" as a GL user, and select the respective request ID that failed.
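To spot the offending group before running Journal Import, a hedged diagnostic sketch (assuming the standard GL_INTERFACE columns) that lists groups whose entered debits and credits do not net to zero:

```sql
-- Groups returned here would cause an unbalanced-journal failure,
-- unless suspense posting is enabled for the ledger/source.
SELECT group_id,
       user_je_source_name,
       SUM(NVL(entered_dr, 0) - NVL(entered_cr, 0)) AS net_amount
  FROM gl_interface
 GROUP BY group_id, user_je_source_name
HAVING SUM(NVL(entered_dr, 0) - NVL(entered_cr, 0)) <> 0;
```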
Thanks &
Best Regards, -
I am trying to run Journal Import from the Import Journals page with source Manual. When selecting the ledger, the following error occurs: **FRM-41830 List of values contains no entries**. Am I missing something with my security?
Hi,
Please see these if these documents help.
Note: 1082995.6 - FRM-40212 Invalid Value for Field Selecting Group_Id For a Journal Import Run
Note: 301721.1 - GLLEZL Journal Import For Purchasing Returns No Data Found
Note: 400810.1 - FRM-41830: List of Values Contains No Entries When Attempting to Upload Budget Amounts
Note: 160064.1 - FRM-41830 in Journal Import Delete Form
Regards,
Hussein -
Journal Import - Multiple Reporting Curriencies
Can someone please let me know how I can handle the Journal Import process for multiple currencies, and what validations I need to perform?
The full error I am facing is below:
group id = 452132
LEZL0023: Journal Import can only process data from one table at a time.
<x gllcnt() 19-OCT-2012 00:33:21
Error in: gllcnt
Function return status: 0
Function Err Message: multiple tables
Function warning number: -1
sqlcaid: sqlabc: 0 sqlcode: 1403 sqlerrml: 25
sqlerrmc:
ORA-01403: no data found
1 Concurrent request ended in an error -
Journal Import finds no records in GL_INTERFACE for processing.
Hi,
I'm using Oracle Web ADI (11i) to upload journal entries from a spreadsheet to GL.
When the request finishes, this message is shown in the output:
Journal Import finds no records in GL_INTERFACE for processing.
Check SET_OF_BOOKS_ID, USER_JE_SOURCE_NAME, and GROUP_ID of import records.
If no GROUP_ID is specified, then only data with no GROUP_ID will be retrieved. Note that most data
from the Oracle subledgers has a GROUP_ID, and will not be retrieved if no GROUP_ID is specified.
Can you help me resolve it?
Thx.
Hi Msk,
You have below errors,
LEZL0008: Found no interface records to process.
LEZL0009: Check LEDGER_ID, GROUP_ID, and USER_JE_SOURCE_NAME of interface records.
Journal Import Finds No Records in gl_interface for Processing [ID 141824.1]
Journal Import Finds No Records in GL_INTERFACE For Processing For AX and AP Sources [ID 360994.1]
GLMRCU does not populate GL_INTERFACE to produce journal for reporting sob [ID 1081808.6]
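Before rerunning the import, it can help to compare what is actually in the interface table against the keys Journal Import selects on; a hedged sketch assuming the 11i GL_INTERFACE columns:

```sql
-- Each distinct combination must match the SET_OF_BOOKS_ID, source,
-- and GROUP_ID passed to Journal Import; rows that carry a GROUP_ID
-- are skipped when no GROUP_ID is specified on the request.
SELECT set_of_books_id,
       user_je_source_name,
       group_id,
       status,
       COUNT(*) AS row_count
  FROM gl_interface
 GROUP BY set_of_books_id, user_je_source_name, group_id, status;
```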
Regards
Helios -
Payables Transfer to General Ledger does not kick off Journal Import
Hello,
This is on 11.5.10.2.
We have some invoice batches that are not posted in GL. The problem is sporadic and only happens for some batches; it is unclear why some batches post fine and others do not.
The problem is with the Payables Transfer to General Ledger program. After the batches are validated and Create Accounting has executed successfully, we submit the Payables Transfer to General Ledger program.
This program does not kick off the Journal Import child process for these batches. It just completes normally without showing any error detail in the log/output.
What could be the reason? Why is Payables Transfer to General Ledger not kicking off Journal Import for these batches?
I have even checked GL_INTERFACE, and there is nothing there.
Please advise.
Thanks
D
Are these invoices accounted, and was Create Accounting done for these invoices?
If so these should be available for the import. -
Import Server detects file in Ready folder but does not process
I have a problem where MDIS detects a file in the ready folder, but it does not launch the import:
2772 2008/10/22 23:37:52.240 1.xml
Source file retrieval: Delta: 0.001451 seconds.
2772 2008/10/22 23:38:23.428 1.xml
Source file retrieval: Delta: 0.001151 seconds.
2772 2008/10/22 23:38:54.678 1.xml
Source file retrieval: Delta: 0.006335 seconds.
2772 2008/10/22 23:39:25.897 1.xml
Source file retrieval: Delta: 0.011595 seconds.
2772 2008/10/22 23:39:57.272 1.xml
Source file retrieval: Delta: 0.003268 seconds.
2772 2008/10/22 23:40:28.507 1.xml
Source file retrieval: Delta: 0.006276 seconds.
The log file shows the above. You can see it detects that the file is there, but since it does not launch the import, it does not process the file or even move it to the Exception or Archive folders. I have confirmed the settings in the MDIS.ini file (e.g. password). I can import the same file using Import Manager without any errors.
Please help!
Hi Jitesh,
Your suggestion to try a different file type yielded positive results. I created a simple tab-delimited text file and import map, and changed the existing port from using XML based on the MATMAS05 schema to using the text file. The log shows:
6056 2008/10/23 17:32:34.066 [MDS=ROBLB2K1H2J Repos=JAN Master ClientSystem=U2K2_MDM Port=Materials_from_U2K2_MDM]: ImportTask: Task started. Chunk size[50000], No. parallel chunks[5]
4572 2008/10/23 17:32:34.144 [MDS=ROBLB2K1H2J Repos=JAN Master ClientSystem=U2K2_MDM Port=Materials_from_U2K2_MDM]: xStructuralTransformer: Thread started.
1160 2008/10/23 17:32:34.144 [MDS=ROBLB2K1H2J Repos=JAN Master ClientSystem=U2K2_MDM Port=Materials_from_U2K2_MDM]: xValueTransformer: Thread started.
4192 2008/10/23 17:32:34.144 [MDS=ROBLB2K1H2J Repos=JAN Master ClientSystem=U2K2_MDM Port=Materials_from_U2K2_MDM]: xImporter: Thread started.
4572 2008/10/23 17:32:34.144 [MDS=ROBLB2K1H2J Repos=JAN Master ClientSystem=U2K2_MDM Port=Materials_from_U2K2_MDM]: xStructuralTransformer: Thread finished; Start -> End: 0.000000000 seconds.
1160 2008/10/23 17:32:34.160 [MDS=ROBLB2K1H2J Repos=JAN Master ClientSystem=U2K2_MDM Port=Materials_from_U2K2_MDM]: xValueTransformer: Thread finished; Start -> End: 0.000000000 seconds.
4192 2008/10/23 17:32:34.660 Repository Load Successful. [JAN Master]: Delta: 0.085377 seconds.
4192 2008/10/23 17:32:35.958 [MDS=ROBLB2K1H2J Repos=JAN Master ClientSystem=U2K2_MDM Port=Materials_from_U2K2_MDM]: xImporter: Thread finished; Start -> End: 1.000000000 seconds.
6056 2008/10/23 17:32:36.458 [MDS=ROBLB2K1H2J Repos=JAN Master ClientSystem=U2K2_MDM Port=Materials_from_U2K2_MDM]: ImportTask: Task finished. Chunk size[50000], No. parallel chunks[5]
4192 2008/10/23 17:34:11.133 22355-3-1-10-23-08-06-32-57_081023-063303_999.xml
Source file retrieval: Delta: 0.006444 seconds.
4192 2008/10/23 17:34:11.242 Repository Load Successful. [JAN Master]: Delta: 0.028803 seconds.
4572 2008/10/23 17:34:43.825 22355-3-1-10-23-08-06-32-57_081023-063303_999.xml
Source file retrieval: Delta: 0.003005 seconds.
4572 2008/10/23 17:35:16.018 22355-3-1-10-23-08-06-32-57_081023-063303_999.xml
Source file retrieval: Delta: 0.010113 seconds.
4572 2008/10/23 17:35:47.742 22355-3-1-10-23-08-06-32-57_081023-063303_999.xml
Source file retrieval: Delta: 0.001994 seconds.
4572 2008/10/23 17:36:19.419 22355-3-1-10-23-08-06-32-57_081023-063303_999.xml
Source file retrieval: Delta: 0.012413 seconds.
The text file was immediately picked up and imported. This proves that (1) MDIS is functioning and (2) the user/password is correct. Unfortunately, you will also see from the log above that when I changed the port back to XML, the problem was not resolved.
Any ideas? -
Hi All,
I have successfully completed the Journal Import process.
Now, if I want to run this Journal Import process using FND_SUBMIT.SUBMIT_REQUEST from a procedure, what parameters do I have to pass?
Request you to guide me.
Thanks
Sanjay
Sanjay,
You need to insert rows into gl_interface_control first using:
gl_journal_import_pkg.populate_interface_control
   (user_je_source_name   => p_je_source_name,
    group_id              => p_group_id,
    set_of_books_id       => p_ledger_id,
    interface_run_id      => p_interface_run_id,
    table_name            => p_table_name,
    processed_data_action => p_action
   );
And then call Journal import using:
fnd_request.submit_request (application => 'SQLGL', -- application short name
                            program     => 'GLLEZL', -- program short name
                            description => NULL, -- program name
                            start_time  => NULL, -- start date
                            sub_request => FALSE, -- sub-request
                            argument1   => TO_CHAR(p_interface_run_id), -- interface run id
                            argument2   => '1', -- set of books id
                            argument3   => 'N', -- error to suspense flag
                            argument4   => NULL, -- from accounting date
                            argument5   => NULL, -- to accounting date
                            argument6   => l_summary_flag, -- create summary flag
                            argument7   => 'N', -- import desc flex flag
                            argument8   => 'Y' -- data security mode flag
                           );
Thanks
Nagamohan -
Tp import flag not properly set in "STMS", tp processes not ended
Since the last stop of our Q system, the display of the import queue hasn't worked properly. When we transport requests from the development system to the Q system, the truck icon doesn't disappear. The processes in AIX are not finished; after a while the "Overview of the transport logs" shows that the import is done, but the truck doesn't disappear.
We tested the transport tool (RSTPTEST): OK. In the TMS configuration we ran the communication test: OK.
Sometimes we can make the truck disappear by deleting the .LOB file in /usr/sap/trans/tmp, but this doesn't work for all requests.
Check the Import Monitor and the tp system log. If the truck icon still shows, it is most likely because the import is not fully finished. As I said, check the Import Monitor and let us know what it says.
Regards
Juan -
All users can not see some of BPA 'free attributes' in process models
All,
We are using BPA 10134 to model processes. We have a very strange issue.
When we log in as a particular user from a particular machine, we can see the following attributes for a human task:
User Attribute Text1
User Attribute Text2
User Attribute Text3
But when we log in as another user, we cannot see the same attributes in the process models.
Does anybody have any idea what we are missing?
Regards,
Praveen
Hi Ashish,
I am new to BPA, so I am really not sure what filters mean or how to set them up.
Anyway, I will explore a bit and try to resolve it. If you have any good article on BPA filters, please post it here.
regards,
Praveen -
How not to get a 'Journal Import Created' description at the Journal Entry Lines?
For records loaded through Journal Import,
I always get a 'Journal Import Created' description on the journal entry lines.
Instead of this description, I want more useful information, which I may include in the GL_INTERFACE table, possibly in the REFERENCE10 field.
For those who will reply to this question,
thank you very much in advance.
To populate this journal description, the REFERENCE10 column must be populated in the GL_INTERFACE table.
However, I have also discovered (version 11.03) that if you populate this column but then import summarized journals, Oracle discards the value and reverts to "Journal Import Created" unless the value in the REFERENCE10 column is identical for all records being summarized.
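So, when importing in summary mode, the REFERENCE10 value has to be uniform across the group; a hedged sketch (the description text and bind variable are illustrative):

```sql
-- Give every row in the group the same REFERENCE10 so the description
-- survives summarization instead of reverting to 'Journal Import Created'.
UPDATE gl_interface
   SET reference10 = 'Monthly upload description' -- hypothetical text
 WHERE group_id = :p_group_id;                    -- hypothetical bind
```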