Journal Import Issue

Hi
Our organization has multiple branches.
We generate branch-wise journal vouchers in a single batch for payroll salaries.
For this we use the Journal Import program, and it runs successfully.
Now we want to create a document number for each J.V. in this single batch.
Can anyone suggest a way to accomplish this?
Thanks in advance.
Regards,
Sanjay

Similar Messages

  • Journal Import EBS R12 Issue

    Hi All,
    Environment :
    - Solaris 11
    - EBS R12
    - Oracle DB 11.2.0.3.0
    The Journal Import profile options are still at their defaults (no tuning has been done yet).
    Our Journal Import cannot process more than 4,000 records.
    When we try to import more than 4,000 records, the OAM view log shows a 'SQL*Net message to client' error, and when I look in LAB128 and OEM, the process usually stops on gl.gl_je_lines and sometimes on gl.gl_je_headers.
    I have already followed the suggestions from the Metalink forum:
    changed sqlnet.ora and listener.ora on the DB server
    added a sqlnet.ora on the Apps server (EBS R12 server)
    Now there is another issue: the 'SQL*Net message to client' error is no longer raised, but when I look at the Journal Import OAM log, LAB128 and OEM, the sessions are inactive and the process looks like it is looping forever.
    I need suggestions for resolving this issue.
    Any help is very much appreciated (and will be awarded points). Thanks in advance.
    Best Regards,
    Yohanes Hany Haryadi Widodo

    Dear Mr. Hussein,
    Our EBS version is 12.1.3, while
    R12: Journal Import Failing With ORA-24337 Error When Importing All GROUP IDs [ID 1159594.1] applies to Oracle General Ledger versions 12.0.0 to 12.1.2 (releases 12.0 to 12.1).
    I have already changed the following settings:
    - To revert to traditional Oracle Net server tracing/logging, set the following parameter in the server's sqlnet.ora:
    DIAG_ADR_ENABLED = OFF
    - To back out the ADR diag for the Listener component, set the following parameter in the server's listener.ora:
    DIAG_ADR_ENABLED_<listenername> = OFF
    - Where the <listenername> would be replaced with the actual name of the configured listener(s) in the listener.ora configuration file. For example, if the listener name is 'LISTENER', the parameter would read:
    DIAG_ADR_ENABLED_LISTENER = OFF
    - Reload or restart the TNS Listener for the parameter change to take effect.
    ACTION PLAN
    ============
    We will need to trace a connection on both CLIENT and SERVER endpoints to see what is happening. Please follow these steps:
    1. Add the following parameters in the sqlnet.ora file on the CLIENT workstation (where sql loader is executed):
    TRACE_LEVEL_CLIENT=16
    TRACE_DIRECTORY_CLIENT=<some_known_directory>
    TRACE_FILE_CLIENT=client
    TRACE_UNIQUE_CLIENT=ON
    TRACE_TIMESTAMP_CLIENT=ON
    DIAG_ADR_ENABLED =OFF -- add this in case of 11g client
    If you need to restrict the amount of disk space used by the long-term traces then you can also set the following:
    TRACE_FILELEN_CLIENT=<file_size_in_Kbytes>
    TRACE_FILENO_CLIENT=<number_of_files>
    2. Add the following parameters in the sqlnet.ora file on the SERVER:
    TRACE_LEVEL_SERVER=16
    TRACE_DIRECTORY_SERVER=<some_known_directory>
    TRACE_FILE_SERVER=server
    TRACE_TIMESTAMP_SERVER=ON
    DIAG_ADR_ENABLED =OFF
    If you need to restrict the amount of disk space used by the long-term traces then you can also set the following:
    TRACE_FILELEN_SERVER=<file_size_in_Kbytes>
    TRACE_FILENO_SERVER=<number_of_files>
    3. Try to reproduce the issue.
    4. Check if trace files were created.
    5. Disable tracing by removing the TRACE lines from sqlnet.ora on both CLIENT and SERVER.
    6. Compress (in .zip or .tar.gz format) and upload the trace files.
    We only need a pair of client and server trace files for the same SQL*Plus session which exhibits the issue; in order to match client and server trace files you should use the tips in Note 374116.1 "How to Match Oracle Net Client and Server Trace Files".
    Below is the result of the failing Journal Import:
    +---------------------------------------------------------------------------+
    General Ledger: Version : 12.0.0
    Copyright (c) 1979, 1999, Oracle Corporation. All rights reserved.
    GLLEZL module: Journal Import
    +---------------------------------------------------------------------------+
    Current system time is 07-MAY-2013 19:43:26
    +---------------------------------------------------------------------------+
    gllsys() 07-MAY-2013 19:43:26
        fnd_user_id = 1164
        fnd_user_name = CN_FAH_MANAGER
        fnd_login_id = 134937
        con_request_id = 491859
        sus_on = 0
        from_date =
        to_date =
        create_summary = 1
        archive = 0
        num_rec = 25000
        run_id = 6415
    << gllsys() 07-MAY-2013 19:43:26
    SHRD0108: Retrieved 202 records from fnd_currencies
    gllcnt() 07-MAY-2013 19:43:26SHRD0118: Updated 1 record(s) in table: gl_interface_control
    source name = CN FAH Credit Card 
    interface source name = CN FAH Credit Card 
    group id = 17232
    ledger_id = -1
    LEZL0001: Found 1 sources to process.
    glluch() 07-MAY-2013 19:43:26
    << glluch() 07-MAY-2013 19:43:26
    gl_import_hook_pkg.pre_module_hook() 07-MAY-2013 19:43:26
    << gl_import_hook_pkg.pre_module_hook() 07-MAY-2013 19:43:26
    glusbe() 07-MAY-2013 19:43:26
    << glusbe() 07-MAY-2013 19:43:26
    << gllcnt() 07-MAY-2013 19:43:26
    gllacc() 07-MAY-2013 19:43:26
    << gllacc() 07-MAY-2013 19:43:26
    gllenc() 07-MAY-2013 19:43:26SHRD0108: Retrieved 6 records from gl_encumbrance_types
    << gllenc() 07-MAY-2013 19:43:26
    gllfss() 07-MAY-2013 19:43:26LEZL0005: Successfully finished building dynamic SQL statement.
    << gllfss() 07-MAY-2013 19:43:26
    gllcje() 07-MAY-2013 19:43:26
    gllalb() 07-MAY-2013 19:43:26
    << gllalb() 07-MAY-2013 19:43:26
    glllgr() 07-MAY-2013 19:43:34
    gllpst() 07-MAY-2013 19:43:34SHRD0108: Retrieved 45 records from gl_period_statuses
    << gllpst() 07-MAY-2013 19:43:34
    gllbud() 07-MAY-2013 19:43:34
    << gllbud() 07-MAY-2013 19:43:34
        currency = IDR
        sus_flag = N
        ic_flag = Y
        bc_flag = N
        latest_opened_encumbrance_year = 2011
    << glllgr() 07-MAY-2013 19:43:34
    SHRD0108: Retrieved 200 records from gl_je_categories
    << gllged() 07-MAY-2013 19:43:34
    << gllged() 07-MAY-2013 19:43:34
    << gllged() 07-MAY-2013 19:43:34
    << gllged() 07-MAY-2013 19:43:34
    << gllged() 07-MAY-2013 19:43:34
    << gllged() 07-MAY-2013 19:43:34
    << gllged() 07-MAY-2013 19:43:34
    << gllged() 07-MAY-2013 19:43:34
    << gllged() 07-MAY-2013 19:43:34
    << gllged() 07-MAY-2013 19:43:34
    << gllged() 07-MAY-2013 19:43:34
    << gllged() 07-MAY-2013 19:43:34
    << gllged() 07-MAY-2013 19:43:34
    << gllged() 07-MAY-2013 19:43:34
    << gllged() 07-MAY-2013 19:43:34
    <x gllcje() 07-MAY-2013 19:49:18
    Error in: gllcje
    Function return status: 0
    Function Err Message: Executing upd_prep using descriptor updbindda
    Function warning number: -1
    sqlcaid:   sqlabc: 0  sqlcode:  -3113  sqlerrml: 48
    sqlerrmc:
    ORA-03113: end-of-file on communication channel
    sqlerrp:       sqlerrd: 0 1 0 0 0 538976288
    sqlwarn:           sqltext: 
    *****************************************************SHRD0044: Process logging off database and exiting ...
    +---------------------------------------------------------------------------+
    Start of log messages from FND_FILE
    +---------------------------------------------------------------------------+
    +---------------------------------------------------------------------------+
    End of log messages from FND_FILE
    +---------------------------------------------------------------------------+
    ORACLE error 3114 in AFPRSR-Resubmit_Time
    Cause: AFPRSR-Resubmit_Time failed due to ORA-03114: not connected to ORACLE
    The SQL statement being executed at the time of the error was:  and was executed from the file .
    Routine FDPCLS encountered an error changing request 491859 status.
    Contact your support representative.
    ORACLE error 3114 in close_server_files
    Cause: close_server_files failed due to ORA-03114: not connected to ORACLE.
    The SQL statement being executed at the time of the error was: &SQLSTMT and was executed from the file &ERRFILE.
    ORACLE error 3114 in fetch_lines
    Cause: fetch_lines failed due to ORA-03114: not connected to ORACLE.
    The SQL statement being executed at the time of the error was: &SQLSTMT and was executed from the file &ERRFILE.
    ORACLE error 3114 in open_server_files
    Cause: open_server_files failed due to ORA-03114: not connected to ORACLE.
    The SQL statement being executed at the time of the error was: &SQLSTMT and was executed from the file &ERRFILE.
    ORACLE error 3114 in close_user_handles
    Cause: close_user_handles failed due to ORA-03114: not connected to ORACLE.
    The SQL statement being executed at the time of the error was: &SQLSTMT and was executed from the file &ERRFILE.
    ORACLE error 3114 in FDPCLS
    Cause: FDPCLS failed due to ORA-03114: not connected to ORACLE
    The SQL statement being executed at the time of the error was: lock TABLE FND_CONCURRENT_REQUESTS IN SHARE UPDATE MODE and was executed from the
    /u02/oracle/PFT/apps/apps_st/appl/gl/12.0.0/bin/GLLEZL
    Program exited with status 1
    +---------------------------------------------------------------------------+
    Executing request completion options...
    Output file size:
    0
    Finished executing request completion options.
    +---------------------------------------------------------------------------+
    Concurrent request completed
    Current system time is 07-MAY-2013 19:49:18
    +---------------------------------------------------------------------------+
    I don't know how ORA-03114: not connected to ORACLE could happen.
    My tentative suspicion is a network issue between the Apps server and the DB server (two-tier), because with AutoBatch for FAH (Financial Accounting Hub) there were 5 Journal Import processes on the request and only this one errored.
    My team raised an SR for this issue two days ago; hopefully we will see their answer in a few hours.
    Best Regards,
    Yohanes

  • Journal Import fails with EC12 issue

    Journal Import fails with an EC12 issue. We know the AR credit memo transactions causing this issue have distribution lines with accounted_dr equal to zero while entered_dr is non-zero; the same is the case with accounted_cr and entered_cr.
    The issue is with credit memos generated out of iReceivables, which create distributions with null amounts when applied to invoices with rules. The null amounts are on the REV/UNEARN distribution lines. We applied patch# 12957348 as suggested on Metalink, and we noticed even more such transactions after the patch was applied.
    Any pointers are really appreciated. We have over 5,000 such lines, so it is not possible to clear them manually, and since these CMs are phased we are sure to see them again with this issue until it is resolved. Note# 285995.1 didn't help either.
    We are on 11.5.10.2 with a 10g database.
    K

  • Payroll Journal Import Error EF04

    Hi,
    We are working in an Oracle Applications 11i (11.5.0) TEST instance.
    As a monthly operation, the payroll personnel upload the salaries journal to the GL interface with Oracle Web ADI. We have an Oracle HRMS full install.
    We are trying to set up and use the direct Payroll-to-GL interface. The following steps are done so far:
    1. The Cost Allocation flexfield is set up and points to the same value sets as the Accounting key flexfield.
    2. The KFF qualifiers are now all enabled for all segments.
    3. 5 costing accounts for 5 elements are entered directly for an employee at the employee assignment element entry level.
    4. The GL map to costing segments is done, and although the employee is paid in a foreign currency, his payroll is assigned to the functional set of books.
    5. Costing is run.
    6. The cost breakdown report for the costing run shows only 2 lines; this is my first issue.
    7. Transfer to GL is requested.
    8. On the attempt to import the journals from Payroll, the request completes with a warning, and the two lines have error code EF04.
    The GL super user has checked the 5 accounts; they are valid and their combinations exist in the system.
    What do we need to check?
    Thanks ...

    Please review these docs and see if it helps.
    Journal Import Error with Status Code EF04 [ID 235029.1]
    Journal Import fails with EF04 error but security rule violation for new code combinations [ID 953109.1]
    Journal Import - Record With Two Errors Ef04 And Em30 - Em30 Code Is Incorrect [ID 1175769.1]
    Receiving EF04 Error when Consolidation Run Option is Set to Run Journal Import [ID 150841.1]
    Doing a Journal Import Encounters EF04 Invalid Accounting Flexfields [ID 107896.1]
    EF04 error in GLLEZL incorrect, as code combination gets created [ID 368198.1]
    Errors Caused Because of Misclassified Accounts [ID 231948.1]
    How to Use the Journal Import Correction Screen [ID 1056801.6]
    Thanks,
    Hussein

  • Oracle GL error during journal import- ICM DOWN

    Hi all,
    I hit an error regarding the Concurrent Manager in Oracle GL during the journal import from the GL_INTERFACE table. I had already populated the table with a set of dummy values and run Journal Import to import the values from the table. I clicked on Correct and searched for the Journal Import data, but Oracle Applications prompted me with an error - FRM-40350: Query caused no records to be retrieved. So I viewed the requests, and each request ID's phase is "Pending" with its status as "Standby". I checked the diagnostics and the error code was as follows:
    CONC-DG-INACTIVE ICM DOWN
    This request may have to wait on one or more of the following requests to complete before it can begin execution:
    I have 2 doubts. Firstly, could it be that I couldn't search for the Journal Import data at Journal: Import - Correct because the Concurrent Manager was down, thus prompting error FRM-40350? Please advise. Secondly, does CONC-DG-INACTIVE ICM DOWN suggest that I should start up the concurrent manager script? Please advise.
    thanks in advance
    Larry
    user572825

    > could it be that I couldn't search for the Journal Import data at Journal: Import - Correct because the Concurrent Manager was down, thus prompting error FRM-40350?
    No, I don't think so.
    > does CONC-DG-INACTIVE ICM DOWN suggest that I should start up the concurrent manager script?
    First check if there are any running requests and wait until they finish. If there are no running requests, try to restart the CM. If that does not solve your issue, run the cmclean.sql script and then restart the CM.
    fadi hasweh
    http://oracle-magic.blogspot.com/
    Oracle is not Magic, it just takes years of experience

  • Journal import fails when called from PLSQL

    Hi,
    When Journal Import is called from PL/SQL code, it fails with the error 'gllacc() Function returning without value' and 'no data found'.
    The same transaction runs successfully from the front end.
    I checked both the gl_interface and gl_interface_control tables but couldn't find the issue.
    Any info on this would be very helpful.
    Thanks
    Sandhya

    FOR l_rec IN (SELECT ledger_id, group_id
                  FROM apps.gl_interface
                  WHERE status = 'NEW'
                  AND user_je_source_name = 'GIS_DATA_CONVERSION'
                  GROUP BY ledger_id, group_id
                  ORDER BY group_id)
    LOOP
      -- Flag the interface rows for this ledger/group so Journal Import picks them up
      apps.gl_journal_import_pkg.populate_interface_control (
          user_je_source_name   => 'GIS_DATA_CONVERSION',
          group_id              => l_rec.group_id,
          set_of_books_id       => l_rec.ledger_id,
          interface_run_id      => vl_interface_id,   -- declared earlier in the procedure
          table_name            => 'GL_INTERFACE',
          processed_data_action => 'D');
      COMMIT;
      -- Submit the Journal Import concurrent program for this ledger/group
      vl_request_id := apps.fnd_request.submit_request (
          application => 'SQLGL',                 -- application short name
          program     => 'GLLEZLSRS',             -- program short name
          description => NULL,                    -- program name
          start_time  => NULL,                    -- start date
          sub_request => FALSE,                   -- sub-request
          argument1   => 2065,                    -- data access set id
          argument2   => 'GIS_DATA_CONVERSION',   -- source
          argument3   => l_rec.ledger_id,         -- set of books id
          argument4   => l_rec.group_id,          -- group id
          argument5   => 'N',                     -- post errors to suspense flag
          argument6   => NULL,                    -- create summary flag
          argument7   => 'N');                    -- import descriptive flexfield flag
      COMMIT;
      IF (vl_request_id = 0) THEN
        xxgis.gis_conv_util_pkg.debug_print_p(1, 'FND_LOG', 'E001: Journal Import Submission Failed. ' || SQLERRM);
        retcode := 2;
        EXIT;
      ELSE
        xxgis.gis_conv_util_pkg.debug_print_p(1, 'FND_LOG', 'P001: Submitted Journal Import Program for group id: ' || l_rec.group_id ||
            ' and ledger: ' || l_rec.ledger_id || ', Request ID: ' || vl_request_id);
      END IF;
    END LOOP;
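    When this kind of loop is run from a standalone session, one gap that often comes up (only a sketch of a possibility, not a confirmed diagnosis of the gllacc() error above) is that the applications context is never initialized and the submitted request is never waited on. A minimal sketch, assuming hypothetical user, responsibility and request ids:
    DECLARE
      l_request_id NUMBER := 123456;  -- hypothetical: the value returned by fnd_request.submit_request
      l_phase      VARCHAR2(80);
      l_status     VARCHAR2(80);
      l_dev_phase  VARCHAR2(80);
      l_dev_status VARCHAR2(80);
      l_message    VARCHAR2(2000);
      l_ok         BOOLEAN;
    BEGIN
      -- Hypothetical ids: substitute a real GL user / responsibility / application id
      apps.fnd_global.apps_initialize(user_id => 1001, resp_id => 50001, resp_appl_id => 101);

      -- Block until the concurrent request finishes (or the timeout expires)
      l_ok := apps.fnd_concurrent.wait_for_request(
                request_id => l_request_id,
                interval   => 10,    -- poll every 10 seconds
                max_wait   => 600,   -- stop waiting after 10 minutes
                phase      => l_phase,
                status     => l_status,
                dev_phase  => l_dev_phase,
                dev_status => l_dev_status,
                message    => l_message);

      IF NOT l_ok OR l_dev_status <> 'NORMAL' THEN
        dbms_output.put_line('Journal Import did not complete normally: ' || l_message);
      END IF;
    END;
    /
    In the loop above, apps.fnd_global.apps_initialize would need to run before apps.fnd_request.submit_request for the initialized context to apply to the submission itself.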

  • GL : after journal import currency conversion type change to User

    Hi
    I have the following data in the GL_INTERFACE table:
    User_currency_conversion_type = 'Corporate'
    Currency_conversion_date = '9/21/2008' (taken from GL_DAILY_RATES)
    Currency_conversion_rate = NULL
    As per GL_DAILY_RATES, the conversion rate from this currency to the functional currency is 1.5.
    After Journal Import, GL_JE_HEADERS is populated with:
    User_currency_conversion_type = 'User'
    Currency_conversion_date = '9/21/2008'
    Currency_conversion_rate = 1
    Can you please explain why USER_CURRENCY_CONVERSION_TYPE and CURRENCY_CONVERSION_RATE are changed to these values?
    If you have a good document on Journal Import rules and logic, please share.
    Thanks in advance
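    For reference, a minimal sketch of the two ways the conversion columns in GL_INTERFACE are normally populated (the values below are illustrative, not taken from the data above): name a conversion type and date and leave the rate NULL so Journal Import looks the rate up in GL_DAILY_RATES, or supply the rate yourself with type 'User'.
    -- Daily-rate conversion: let Journal Import look the rate up in GL_DAILY_RATES
    UPDATE gl_interface
       SET user_currency_conversion_type = 'Corporate',
           currency_conversion_date      = TO_DATE('21-SEP-2008', 'DD-MON-YYYY'),
           currency_conversion_rate      = NULL
     WHERE group_id = 1001;   -- illustrative group id

    -- User-rate conversion: supply the rate explicitly
    UPDATE gl_interface
       SET user_currency_conversion_type = 'User',
           currency_conversion_rate      = 1.5,
           currency_conversion_date      = TO_DATE('21-SEP-2008', 'DD-MON-YYYY')
     WHERE group_id = 1002;   -- illustrative group id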

    Hello Mr. G. Lakshmipathi,
    Thanks! I'll try this function module READ_EXCHANGE_RATE.
    However, I think this issue is not really related to the exchange rate. The amount conversion from USD to JOD happens correctly based on the current pricing formula that we are using, i.e. amounts in JOD have 3 decimals after conversion from USD, but when they are stored in the pricing structure the last decimal is truncated, as the currency against the 3 condition types is USD (USD has 2 decimals). Hence the amounts have 2 decimals.
    Thanks,
    Amit

  • Excel Spreadsheet Journal Import

    I'm trying to debug an issue related to using the Excel Spreadsheet Journal Import. Where does the validation take place? I'm using Financial 9.0. But what actually happens when you click the import button? It looks like an XML message is sent. Does anyone know if this happens and if so what the name of the message is that is supposed to handle the incoming message??
    Thanks all in advance!!

    Michael Witherell wrote:
    Kathleen,
    No, Data Merge won't do. The context of need in this case is placing an Excel spreadsheet into InDesign; with Show Import Options turned on; with the workflow idea of bringing it in unformatted; yet applying a pre-made TableStyle to it.
    It turns out in this case that you have to follow up with selecting the first row (the header row) and clicking on Table > Convert (something or other) and choosing Header, thus turning the header row into a mechanically-repeating row (as in the case of a very long table that flows over more than one page).
    You can do this in advance if it is a Word doc table. That is, it can be set in Word before placing. Apparently you cannot do so in advance in Excel.
    Therefore, upon placing, the TableStyle appears to fail to some degree until you catch up with the Header Row conversion.
    Best to you,
    Mike Witherell in Alexandria, VA
    I'm not sure if this would help:
    * Create the heading row content on the master page, either as literal text in a text frame, or as a text variable.
    * Use Data Merge multiple records per page mode on the body page.
    This creates a pseudo table heading above the merged data on each new page. You'll have to adjust the height, vertical justification, and text wrap when defining the master page text frame text wrap to force the merged data to begin below it on the page.
    If you truly need a table, perhaps because some column content needs to wrap within cells, then "never mind." But if that's not the problem, perhaps nested and GREP styles could manage the formatting you need.
    Regards,
    Peter
    Peter Gold
    KnowHow ProServices

  • Security rule whether be checked when journal import

    Hi All,
    Thanks for your attention, I have got an issue about security rule.
    When I use GL_INTERFACE to import journals into the EBS system, I believe security rules are not checked, only cross-validation rules. But I have found that sometimes security rules are checked for some responsibilities during Journal Import. Why does this happen, and is there any profile option or setup to control it?
    Thanks for your help.
    Best Regards
    Spark

    Hi Spark,
    It looks like Journal Import doesn't check security rules, but it checks cross-validation rules upon dynamic insertion. Sorry for misleading you earlier. You can check the Metalink note Journal Import - FAQ [ID 107059.1]. Here are the comments from the note:
    A04. Does the Journal Import process check for Cross-Validation or Security
    Rules violations?
    Journal Import does not check security rules. Transactions that come
    from Oracle subledgers (AR, AP, etc.) already have the CCID (Code
    Combination ID) in the GL_INTERFACE table. These have been validated
    in the feeder system.
    You can also populate the accounting segments directly into the
    gl_interface table and let Journal Import populate the
    code_combination_id. If dynamic insertion is enabled, and this is a
    new combination, then the import program will check for cross
    validation rule violations.
    Thanks,
    Kiran
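    As a rough way to see which path interface lines will take (pre-validated CCIDs versus segment values that go through dynamic insertion and cross-validation checks), a small diagnostic sketch, assuming you can query GL_INTERFACE; the source name is a placeholder:
    SELECT group_id,
           COUNT(*)                              AS total_lines,
           COUNT(code_combination_id)            AS lines_with_ccid,
           COUNT(*) - COUNT(code_combination_id) AS lines_with_segments_only
    FROM   gl_interface
    WHERE  user_je_source_name = 'YOUR_SOURCE'   -- placeholder source name
    GROUP  BY group_id
    ORDER  BY group_id;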

  • Strange import issue .. no longer works in 2.6 or Beta 3

    Strange import issue.
    I use LR-2.6 every day but have only played around with LR-3 for a limited amount of time, so the other day I decided to check it out and tried to import a handful of RAW (Canon) files on an SDXC 8 GB card via a USB card reader. I set it up to copy and convert to DNG and apply a few keywords and metadata, but when I tried it, it acted like it imported the first image and then froze. The card reader lights stopped blinking and just stayed on, and the program task bar doesn't show any progress, not even for the first image. I then went to the folder that I was going to have the images copied to and it contained a file for the first image, but there is no indication that LR-3 even created it, and it doesn't show up in the catalog. When I tried to close LR-3, it popped up the "in the middle of a task" warning, so I just closed the program and tried it a couple more times to no avail.
    So I thought, oh well, I will just open up LR-2.6 and import them from there; guess what, now LR-2.6 does the exact same thing!!
    I thought maybe my card reader was a POS, but dragging all the files to a new folder on my desktop and importing them from there didn't work either. I tried rebooting a couple of times and even opened a new catalog to test with, in both LR-2.6 and LR-3, and neither will work.
    I am stumped - any ideas?

    Thank you for everyone's assistance
    I actually figured out what the problem was; like most issues, it comes back to operator error. It seems I had inadvertently chosen a "rename file on import" template that I had set up to name the files based on a custom fill-in field in the metadata. I normally only used that template when I was exporting images that had been completely edited, so since these images were new imports that field was blank. Hence it would import the first image and then, due to "don't import suspected duplicates" being checked, just stop the import process.

  • IMPORT issues: Works with iMovie 3 but not iMovie 6 ???

    Okay, still trying to get my import issue resolved.
    I have a SONY HDR-HC3 DVmini.
    No problems with iMovie 3 on my Powerbook.
    iMovie 6 on my MDD will let me control the camera, but the screen stays BLUE with no IMPORT.
    My buddy thinks it might be a Quicktime issue.
    He said to download Quicktime Pro, that might resolve this issue.
    I can copy the iMovie 3 file to my machine with iMovie 6 and do the editing, just not the importing...
    THANX

    Hi
    It can be so many things.
    Often the Camera has to be connected to the Charger/mains during Capture.
    A majority are a faulty or badly connected FW-Cable (USB doesn't work at all).
    My list on this:
    *NO CAMERA or A/D-box*
    Cable
    • Be sure that You use FireWire - USB will not work for miniDV tape Cameras
    FireWire - are You sure You're not using the accompanying USB-Cable, but bought a 4-pin to 6-pin FW one?
    • Test another FW-Cable - very often the problem maker.
    Camera
    • Test Your Camera on another Mac so that DV-in still works OK
    • Toggle the iMovie pref. Play-back via Camera (on <-> off a few times)
    • Some Cameras has a Menu where You must select DV-out to get it to work
    • Camera connected to "charger" (mains adaptor) - not just on battery
    Does Your Camera work on another Mac ?
    Sorry to say, it is too easy to turn the 6-pin end of the FW-cable 180 degrees the wrong way.
    This is lethal to the A/D-chip in the Camera = needs an expensive repair.
    (Hard to find out - other than that import/export to another Mac stops working,
    everything else is OK, e.g. recording and playback to TV)
    Connections
    • Daisy Chaining most often doesn’t work (some unique cases - it’s the only way that work (some Canon Cameras ?))
    Try to avoid connecting Camera <--> external HD <--> Mac but import directly to the Mac then move
    the Movie project to dedicated external hard disk.
    Mac
    • Free space on the internal (start-up) hard disk? Please specify the amount of free space.
    (Other hard disks don't count)
    I go for a minimum of 25 GB free space for 4x3 SD Video - and my guess is 5 times more for 16x9 HD ones
    after material is imported and edited.
    SoftWare
    • Deleting the iMovie pref file may help sometimes. I would rather start a new account, log into it, and re-try.
    • Any strange Plug-ins into QuickTime as Perian etc ? Remove and try again.
    • FileVault is off ? (hopefully)
    Using WHAT versions ? .
    • Mac OS - X.5.4 ?
    • QuickTime version ? (This is the heart in both iMovie and FinalCut)
    • iMovie 8 (7.1.?) ?
    • iMovie HD 6 (6.0.4/3) ?
    *Other ways to import Your miniDV tape*
    • Use another Camera. There were tape play-back stations from SONY,
    but they cost about 2-4 times as much as a normal miniDV Camera.
    • If Your Camera works on another Mac, make an iMovie movie project there and move it
    over to Your Mac via an external hard disk.
    (HAS TO BE Mac OS Extended formatted - USB/DOS/FAT32/Mac OS Exchange WILL NOT DO)
    (Should be a FireWire one - USB/USB2 performs badly)
    from LKN 1935.
    Hi Bengt W, I tried it all, but nothing worked. Your answer has been helpful insofar as all the different trials led to the conclusion that there was something wrong with my iMovie software. I therefore threw everything away and reinstalled iMovie from the HD. After that, the export of DV videos (there has not been any problem with HDV videos) to my Sony camcorders worked properly as it did before. Thank you. LKN 1935
    from Karsten.
    in addition to Bengt's excellent '9 yards of advice' ..
    camera set to 'Play' , not rec/computer/etc.?
    camera not on battery, but power-line?
    did your Mac 'recognize' this camera before...?
    a technical check.
    connect camera, on, playback, fw-connected...
    click on the Blue Apple, upper left of your screen ..
    choose 'About../More..
    under Firewire.. what do you read..?
    More
    • FileVault - make sure that it's turned off
    • Network storage - DOESN’T WORK
    • Where did You store/capture/import Your project ?
    External USB hard disk = Bad Choice / FireWire = Good
    If so it has to be Mac OS Extended formatted
    ----> UNIX/DOS/FAT32/Mac OS Exchange is NOT Working for VIDEO !
    mbolander
    Thanks for all your suggestions. What I learned is that I had a software problem. I had something called "Nikon Transfer" on my Mac that was recognizing my Canon camcorder as a still camera and was preventing iMovie from working properly. After uninstalling Nikon Transfer and doing a reboot, everything worked great.
    I never liked the Nikon Transfer software anyway--I guess I'll get a cheap card reader and use that to transfer photos in the future.
    *No Camera or bad import*
    • USB hard disk
    • Network storage
    • File Vault is on
    jiggaman15dg wrote:
    If you have Adobe CS3 or 4 and have Adobe Bridge open, close that,
    or no FireWire will work.
    See if that helps.
    DJ1249 wrote:
    The problem was the external backup hard drive that is connected; you need to disconnect the external drive before the Mac can see the video camera.
    Yours Bengt W

  • Journal Import finds no records in GL_INTERFACE for processing.

    Hi,
    I'm using Oracle Web ADI (11i) to upload journal entries from a spreadsheet to GL.
    When the request finishes, this message is shown in the output:
    Journal Import finds no records in GL_INTERFACE for processing.
    Check SET_OF_BOOKS_ID, USER_JE_SOURCE_NAME, and GROUP_ID of import records.
    If no GROUP_ID is specified, then only data with no GROUP_ID will be retrieved.  Note that most data
    from the Oracle subledgers has a GROUP_ID, and will not be retrieved if no GROUP_ID is specified.
    Can you help me to resolve it?
    Thx.

    Hi Msk;
    You have the errors below:
    LEZL0008: Found no interface records to process.
    LEZL0009: Check LEDGER_ID, GROUP_ID, and USER_JE_SOURCE_NAME of interface records.
    Please review:
    Journal Import Finds No Records in gl_interface for Processing [ID 141824.1]
    Journal Import Finds No Records in GL_INTERFACE For Processing For AX and AP Sources [ID 360994.1]
    GLMRCU does not populate GL_INTERFACE to produce journal for reporting sob [ID 1081808.6]
    Regards
    Helios
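    A quick way to see what Journal Import will actually pick up is to summarize the interface rows by the columns mentioned in those messages. A minimal diagnostic sketch, assuming you can query GL_INTERFACE from the APPS schema:
    SELECT status,
           set_of_books_id,
           user_je_source_name,
           group_id,
           COUNT(*) AS line_count
    FROM   gl_interface
    GROUP  BY status, set_of_books_id, user_je_source_name, group_id
    ORDER  BY user_je_source_name, group_id;
    Rows uploaded by Web ADI should show up here with the expected source and set of books; if the GROUP_ID differs from the one passed to Journal Import (or is null), that matches the situation the LEZL0009 message describes.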

  • Journal Import References

    Hello,
    I am using the following code to import a journal to GL:
    INSERT INTO GL_INTERFACE
         (STATUS, SET_OF_BOOKS_ID, USER_JE_SOURCE_NAME,
         USER_JE_CATEGORY_NAME, ACCOUNTING_DATE, CURRENCY_CODE, DATE_CREATED, CREATED_BY, ACTUAL_FLAG,
         CURRENCY_CONVERSION_DATE, USER_CURRENCY_CONVERSION_TYPE, CURRENCY_CONVERSION_RATE,
         CODE_COMBINATION_ID, ENTERED_DR, ENTERED_CR,
         REFERENCE10)
    VALUES('NEW', i.set_of_books_id, i.je_source,
    i.je_category, i.accounting_date, i.currency_code,
    SYSDATE, i.user_id,'A',
    i.accounting_date, DECODE(i.currency_code, 'USD', 'User', 'Corporate'), DECODE(i.currency_code, 'USD', 1, NULL), i.code_combination_id, i.entered_dr, i.entered_cr,
    i.description);
    The thing is that I need to put a description that appears in GL_JE_HEADERS and not only in GL_JE_LINES. Does anyone know which column has to be filled to achieve this?
    Thanks

    Refer to the GL User Guide. It explains all the reference columns, for example:
    REFERENCE1 (Batch Name): Enter a batch name for your import
    batch. Journal Import creates a default batch name using the
    following format: (Optional User–Entered REFERENCE1) (Source)
    (Request ID) (Actual Flag) (Group ID). If you enter a batch name,
    Journal Import prefixes the first 50 characters of your batch name
    to the above format.
    REFERENCE2 (Batch Description): Enter a description for your
    batch. If you do not enter a batch description, Journal Import
    automatically gives your batch a description using the format:
    Journal Import (Source) (Request Id).
    REFERENCE4 (Journal entry name): Enter a journal entry name
    for your journal entry. Journal Import creates a default journal
    entry name using the following format: (Category Name)
    (Currency) (Currency Conversion Type, if applicable) (Currency
    Conversion Rate, if applicable) (Currency Conversion Date, if
    applicable) (Encumbrance Type ID, if applicable) (Budget Version
    ID, if applicable). If you enter a journal entry name, Journal Import
    prepends the first 25 characters of your journal entry name to the
    above format.
    REFERENCE5 (Journal entry description): Enter a description for
    your journal entry. If you do not enter a journal entry description,
    Journal Import automatically gives your journal entry a description
    using the format: Journal Import – Concurrent Request ID.
    REFERENCE6 (Journal entry reference): Enter a reference name or
    number for your journal entry. If you do not enter a journal entry
    reference, Journal Import automatically creates a journal entry
    reference called Journal Import Created.
    REFERENCE7 (Journal entry reversal flag): Enter Yes to mark
    your journal entry for reversal. If you do not enter Yes, Journal
    Import automatically defaults to No.
    REFERENCE9 (Journal reversal method): Enter Yes to use the
    change sign method, No to use the switch debit/credit method.
    REFERENCE10 (Journal entry line description): Enter a
    description for your journal entry line. If you do not enter a
    journal entry line description, Journal Import uses the subledger
    document sequence value. If there is no document sequence value,
    Journal Import creates a journal entry description called Journal
    Import Created.
    REFERENCE21 through REFERENCE30: Enter a reference name
    or number to further identify your import journal entry lines.
    Columns REFERENCE21 through REFERENCE30 map into
    columns REFERENCE_1 through REFERENCE_10, respectively, of
    the GL_JE_LINES table.
    REFERENCE3: Do not enter a value in this column.
    REFERENCE11 through REFERENCE20: Do not enter a value in
    this column.
    Cheers
    Sunil Chawla
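    Per the column descriptions above, REFERENCE5 carries the journal entry (header) description and REFERENCE4 the journal name. A minimal sketch of the same kind of insert with those columns added; the i.journal_name and i.header_description fields are hypothetical cursor columns, not from the original code:
    INSERT INTO gl_interface
         (status, set_of_books_id, user_je_source_name,
          user_je_category_name, accounting_date, currency_code, date_created, created_by, actual_flag,
          code_combination_id, entered_dr, entered_cr,
          reference4,    -- journal entry name
          reference5,    -- journal entry (header) description -> GL_JE_HEADERS
          reference10)   -- journal entry line description     -> GL_JE_LINES
    VALUES ('NEW', i.set_of_books_id, i.je_source,
            i.je_category, i.accounting_date, i.currency_code, SYSDATE, i.user_id, 'A',
            i.code_combination_id, i.entered_dr, i.entered_cr,
            i.journal_name,         -- hypothetical cursor column
            i.header_description,   -- hypothetical cursor column
            i.description);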

  • How not to get a 'Journal Import Created' description at the Journal Entry Lines?

    For records loaded through Journal Import, I always get a 'Journal Import Created' description on the journal entry lines.
    Instead of this description, I want more useful information, which I may include in the GL_INTERFACE table, possibly in the REFERENCE10 field.
    For those who will reply to this question, thank you very much in advance.

    To populate this journal description, the Reference10 field must be populated in the GL_INTERFACE table.
    However, I have also discovered (version 11.03) that if you populate this field, but then import summarized journals, Oracle discards the value and reverts to "Journal Import Created" unless the value in the reference10 field is identical for all records to be summarized.

  • AP Journal Import Error

    Hi all,
    While running the process to import journals, an error occurred. The description of the error is as below:
    1) The user ran the ACCOUNTS PAYABLE JOURNAL ENTRY EXCEPTION REPORT; the run completed with no error.
    2) The follow-up automatic run of the JOURNAL IMPORT EXECUTION REPORT was interrupted by a failure of the Oracle system resulting in a shutdown of the server. The status indicated Journal Import completed with error. No output file was generated.
    3) The user then followed up with the GL Supervisor responsibility, and when 'GL Supervisor --> Journal --> Enter' is run and the posting month is queried, no unposted journals can be found for the imported data.
    4) Subsequent re-runs of the ACCOUNTS PAYABLE JOURNAL ENTRY EXCEPTION REPORT and the JOURNAL IMPORT EXECUTION REPORT completed with no error. However, the output indicated NO DATA FOUND.
    Below are the output for the re-run:
    Sawit - Set of Books ACCOUNTS PAYABLE JOURNAL ENTRY EXCEPTION REPORT Date: 02-APR-07 Page: 1
    Invoice Distributions with Exceptions
    |----------------Invoice------------------| |--------------------Distribution------------------| Exception
    Supplier Name Invoice Number Currency Amount Num Amount Description GL Date Code
    *** No data exists for this report ***
    Sawit - Set of Books ACCOUNTS PAYABLE JOURNAL ENTRY EXCEPTION REPORT Date: 02-APR-07 Page: 2
    Invoice Payments with Exceptions
    |----------------Payment------------------| |----------------Invoice Payment---------------| Exception
    Supplier Name Document Number Currency Amount Num Invoice Number Amount GL Date Code
    *** No data exists for this report ***
    Sawit - Set of Books ACCOUNTS PAYABLE JOURNAL ENTRY EXCEPTION REPORT Date: 02-APR-07 Page: 3
    Reconciliation Payment Distributions with Exceptions
    Supplier Name Document Number Currency Amount Line Type GL Date Exception Code
    *** No data exists for this report ***
    Sawit - Set of Books ACCOUNTS PAYABLE JOURNAL ENTRY AUDIT REPORT Date: 02-APR-07 Page: 1
    TRANSFER TO GENERAL LEDGER : POSTED INVOICES
    Currency :
    Supplier Na Invoice Num GL Date Accounting Flex Entered Debit Entered Credit Accounted Debit Accounted Credit
    *** No data exists for this report ***
    Sawit - Set of Books ACCOUNTS PAYABLE JOURNAL ENTRY AUDIT REPORT Date: 02-APR-07 Page: 2
    TRANSFER TO GENERAL LEDGER : POSTED PAYMENTS
    Bank Acccount :
    Payment Document :
    Currency :
    Supplier Na Document Num GL Date Accounting Flex Entered Debit Entered Credit Accounted Debit Accounted Credit
    *** No data exists for this report ***
    Sawit - Set of Books ACCOUNTS PAYABLE JOURNAL ENTRY AUDIT REPORT Date: 02-APR-07 Page: 3
    TRANSFER TO GENERAL LEDGER : POSTED RECONCILIATION DISTRIBUTIONS
    Bank Acccount :
    Payment Document :
    Currency :
    Supplier Na Document Num GL Date Accounting Flex Entered Debit Entered Credit Accounted Debit Accounted Credit
    *** No data exists for this report ***
    Sawit - Set of Books Journal Import Execution Report Date: 02-APR-07 08:28
    Concurrent Request ID: 706386 Page: 1
    Journal Import finds no records in gl_interface for processing.
    Check status, set_of_books_id, je_source_name and interface_run_id of import records.
    ***** End of Report *****
    5) A check on gl.gl_interface shows no data.
    6) A check on gl.gl_je_batches shows no such data.
    7) A sweep of un-imported journals done before closing the period shows no unposted records.
    8) As of today, the user has closed the period Mar-07.
    The question is: how do we repost the required data for the month of Mar-07 so that it can be posted into GL using the 'GL Supervisor --> Journal --> Enter' form?
    Please help... thanks a lot.

    I would say those exceptions always happen with Cisco's Java products.
    You need to open a Cisco TAC case for them to collect logs and see in detail (from a programming point of view) why this exception is happening.
    From my experience this may take some time, because the TAC engineer usually has no idea about the programming, so he forwards the log analysis to a programmer who can understand it.
    If you can open a TAC case, then this should be your next step.
    Good luck.
    Amjad
    P.S: The main reason of most such exceptions is bugs.
