Bad file is not created during external table creation.

Hello Experts,
I have created a script for an external table in an Oracle 10g database. Everything works fine except that the bad file is never created, although the log file is. I can't figure out what the issue is, and because of it my shell script fails and the entire program fails. I am attaching the table creation script, the shell script that references it, and the error. Kindly let me know if something is missing. Thanks in advance.
Table Creation Script:
-------------------------------
create table RGIS_TCA_DATA_EXT
(
guid VARCHAR2(250),
badge VARCHAR2(250),
scheduled_store_id VARCHAR2(250),
parent_event_id VARCHAR2(250),
event_id VARCHAR2(250),
organization_number VARCHAR2(250),
customer_number VARCHAR2(250),
store_number VARCHAR2(250),
inventory_date VARCHAR2(250),
full_name VARCHAR2(250),
punch_type VARCHAR2(250),
punch_start_date_time VARCHAR2(250),
punch_end_date_time VARCHAR2(250),
event_meet_site_id VARCHAR2(250),
vehicle_number VARCHAR2(250),
vehicle_description VARCHAR2(250),
vehicle_type VARCHAR2(250),
is_owner VARCHAR2(250),
driver_passenger VARCHAR2(250),
mileage VARCHAR2(250),
adder_code VARCHAR2(250),
bonus_qualifier_code VARCHAR2(250),
store_accuracy VARCHAR2(250),
store_length VARCHAR2(250),
badge_input_type VARCHAR2(250),
source VARCHAR2(250),
created_by VARCHAR2(250),
created_date_time VARCHAR2(250),
updated_by VARCHAR2(250),
updated_date_time VARCHAR2(250),
approver_badge_id VARCHAR2(250),
approver_name VARCHAR2(250),
orig_guid VARCHAR2(250),
edit_type VARCHAR2(250)
)
organization external
(
type ORACLE_LOADER
default directory ETIME_LOAD_DIR
access parameters
(
RECORDS DELIMITED BY NEWLINE
BADFILE ETIME_LOAD_DIR:'tstlms.bad'
LOGFILE ETIME_LOAD_DIR:'tstlms.log'
READSIZE 1048576
FIELDS TERMINATED BY '|'
MISSING FIELD VALUES ARE NULL(
GUID
,BADGE
,SCHEDULED_STORE_ID
,PARENT_EVENT_ID
,EVENT_ID
,ORGANIZATION_NUMBER
,CUSTOMER_NUMBER
,STORE_NUMBER
,INVENTORY_DATE char date_format date mask "YYYYMMDD HH24:MI:SS"
,FULL_NAME
,PUNCH_TYPE
,PUNCH_START_DATE_TIME char date_format date mask "YYYYMMDD HH24:MI:SS"
,PUNCH_END_DATE_TIME char date_format date mask "YYYYMMDD HH24:MI:SS"
,EVENT_MEET_SITE_ID
,VEHICLE_NUMBER
,VEHICLE_DESCRIPTION
,VEHICLE_TYPE
,IS_OWNER
,DRIVER_PASSENGER
,MILEAGE
,ADDER_CODE
,BONUS_QUALIFIER_CODE
,STORE_ACCURACY
,STORE_LENGTH
,BADGE_INPUT_TYPE
,SOURCE
,CREATED_BY
,CREATED_DATE_TIME char date_format date mask "YYYYMMDD HH24:MI:SS"
,UPDATED_BY
,UPDATED_DATE_TIME char date_format date mask "YYYYMMDD HH24:MI:SS"
,APPROVER_BADGE_ID
,APPROVER_NAME
,ORIG_GUID
,EDIT_TYPE
)
)
location (ETIME_LOAD_DIR:'tstlms.dat')
)
reject limit UNLIMITED;
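Note: the ORACLE_LOADER access driver writes the BADFILE only when at least one record is actually rejected; the LOGFILE is written on every access. If every row in tstlms.dat loads cleanly, no tstlms.bad will appear even though tstlms.log does. A minimal way to confirm this from the shell (a sketch only, assuming ETIME_LOAD_DIR maps to /home/custom/sched/in as the error output below suggests, and that $fcp_login is the same connect string used in the script below):

#!/bin/sh
# Force a full read of the external table; ORACLE_LOADER creates the
# bad file only if it rejects at least one record during this scan.
sqlplus -s $fcp_login <<EOF
select count(*) from rgis_tca_data_ext;
exit;
EOF

# Directory path is an assumption based on the wc errors further down.
if [ -f /home/custom/sched/in/tstlms.bad ]; then
    echo "tstlms.bad exists: some records were rejected"
else
    echo "no tstlms.bad: every record loaded cleanly"
fi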
Shell Script:
----------------
version=1.0
umask 000
DATE=`date +%Y%m%d%H%M%S`
TIME=`date +"%H%M%S"`
SOURCE=`hostname`
fcp_login=`echo $1|awk '{print $3}'|sed 's/"//g'|awk -F= '{print $2}'`
fcp_reqid=`echo $1|awk '{print $2}'|sed 's/"//g'|awk -F= '{print $2}'`
TXT1_PATH=/home/ac1/oracle/in/tsdata
TXT2_PATH=/home/ac2/oracle/in/tsdata
ARCH1_PATH=/home/ac1/oracle/in/tsdata
ARCH2_PATH=/home/ac2/oracle/in/tsdata
DEST_PATH=/home/custom/sched/in
PROGLOG=/home/custom/sched/logs/rgis_tca_to_tlms_create.sh.log
PROGNAME=`basename $0`
PROGPATH=/home/custom/sched/scripts
cd $TXT2_PATH
FILELIST2="`ls -lrt tstlmsedits*.dat |awk '{print $9}'`"
NO_OF_FILES2="`ls -lrt tstlmsedits*.dat |awk '{print $9}'|wc -l`"
> $DEST_PATH/tstlmsedits.dat
for i in $FILELIST2
do
cat $i >> $DEST_PATH/tstlmsedits.dat
printf "\n" >> $DEST_PATH/tstlmsedits.dat
mv $i $i.$DATE
#mv $i $TXT2_PATH/test/.
mv $i.$DATE $TXT2_PATH/test/.
done
if test $NO_OF_FILES2 -eq 0
then
echo " no tstlmsedits.dat file exists " >> $PROGLOG
else
echo "created dat file tstlmsedits.dat at $DATE" >> $PROGLOG
echo "-------------------------------------------" >> $PROGLOG
fi
NO_OF_FILES1="`ls -lrt tstlms*.dat |awk '{print $9}'|wc -l`"
FILELIST1="`ls -lrt tstlms*.dat |awk '{print $9}'`"
> $DEST_PATH/tstlms.dat
for i in $FILELIST1
do
cat $i >> $DEST_PATH/tstlms.dat
printf "\n" >> $DEST_PATH/tstlms.dat
mv $i $i.$DATE
# mv $i $TXT2_PATH/test/.
mv $i.$DATE $TXT2_PATH/test/.
done
if test $NO_OF_FILES1 -eq 0
then
echo " no tstlms.dat file exists " >> $PROGLOG
else
echo "created dat file tstlms.dat at $DATE" >> $PROGLOG
fi
cd $TXT1_PATH
FILELIST3="`ls -lrt tstlmsedits*.dat |awk '{print $9}'`"
NO_OF_FILES3="`ls -lrt tstlmsedits*.dat |awk '{print $9}'|wc -l`"
> $DEST_PATH/tstlmsedits.dat
for i in $FILELIST3
do
cat $i >> $DEST_PATH/tstlmsedits.dat
printf "\n" >> $DEST_PATH/tstlmsedits.dat
mv $i $i.$DATE
#mv $i $TXT1_PATH/test/.
mv $i.$DATE $TXT1_PATH/test/.
done
if test $NO_OF_FILES3 -eq 0
then
echo " no tstlmsedits.dat file exists " >> $PROGLOG
else
echo "created dat file tstlmsedits.dat at $DATE" >> $PROGLOG
echo "-------------------------------------------" >> $PROGLOG
fi
NO_OF_FILES4="`ls -lrt tstlms*.dat |awk '{print $9}'|wc -l`"
FILELIST4="`ls -lrt tstlms*.dat |awk '{print $9}'`"
> $DEST_PATH/tstlms.dat
for i in $FILELIST4
do
cat $i >> $DEST_PATH/tstlms.dat
printf "\n" >> $DEST_PATH/tstlms.dat
mv $i $i.$DATE
# mv $i $TXT1_PATH/test/.
mv $i.$DATE $TXT1_PATH/test/.
done
if test $NO_OF_FILES4 -eq 0
then
echo " no tstlms.dat file exists " >> $PROGLOG
else
echo "created dat file tstlms.dat at $DATE" >> $PROGLOG
fi
#connecting to oracle to generate bad files
sqlplus -s $fcp_login<<EOF
select count(*) from rgis_tca_data_ext;
select count(*) from rgis_tca_data_history_ext;
exit;
EOF
#counting the records in files
tot_rec_in_tstlms=`wc -l $DEST_PATH/tstlms.dat | awk ' { print $1 } '`
tot_rec_in_tstlmsedits=`wc -l $DEST_PATH/tstlmsedits.dat | awk ' { print $1 } '`
tot_rec_in_tstlms_bad=`wc -l $DEST_PATH/tstlms.bad | awk ' { print $1 } '`
tot_rec_in_tstlmsedits_bad=`wc -l $DEST_PATH/tstlmsedits.bad | awk ' { print $1 } '`
#updating log table
echo "pl/sql block started"
sqlplus -s $fcp_login<<EOF
define tot_rec_in_tstlms     = '$tot_rec_in_tstlms';
define tot_rec_in_tstlmsedits     = '$tot_rec_in_tstlmsedits';
define tot_rec_in_tstlms_bad     = '$tot_rec_in_tstlms_bad';
define tot_rec_in_tstlmsedits_bad='$tot_rec_in_tstlmsedits_bad';
define fcp_reqid ='$fcp_reqid';
declare
l_tstlms_file_id number := null;
l_tstlmsedits_file_id number := null;
l_tot_rec_in_tstlms number := 0;
l_tot_rec_in_tstlmsedits number := 0;
l_tot_rec_in_tstlms_bad number := 0;
l_tot_rec_in_tstlmsedits_bad number := 0;
l_request_id fnd_concurrent_requests.request_id%type;
l_start_date fnd_concurrent_requests.actual_start_date%type;
l_end_date fnd_concurrent_requests.actual_completion_date%type;
l_conc_prog_name fnd_concurrent_programs.concurrent_program_name%type;
l_requested_by fnd_concurrent_requests.requested_by%type;
l_requested_date fnd_concurrent_requests.request_date%type;
begin
--getting concurrent request details
begin
SELECT fcp.concurrent_program_name,
fcr.request_id,
fcr.actual_start_date,
fcr.actual_completion_date,
fcr.requested_by,
fcr.request_date
INTO l_conc_prog_name,
l_request_id,
l_start_date,
l_end_date,
l_requested_by,
l_requested_date
FROM fnd_concurrent_requests fcr, fnd_concurrent_programs fcp
WHERE fcp.concurrent_program_id = fcr.concurrent_program_id
AND fcr.request_id = &fcp_reqid; --fnd_global.conc_request_id();
exception
when no_data_found then
fnd_file.put_line(fnd_file.log, 'Error:RGIS_TCA_TO_TLMS_CREATE.sh');
fnd_file.put_line(fnd_file.log, 'No data found for request_id');
fnd_file.put_line(fnd_file.log, sqlerrm);
raise_application_error(-20001,
'Error occured when executing RGIS_TCA_TO_TLMS_CREATE.sh ' ||
sqlerrm);
when others then
fnd_file.put_line(fnd_file.log, 'Error:RGIS_TCA_TO_TLMS_CREATE.sh');
fnd_file.put_line(fnd_file.log,
'Error occured when retrieving request_id request_id');
fnd_file.put_line(fnd_file.log, sqlerrm);
raise_application_error(-20001,
'Error occured when executing RGIS_TCA_TO_TLMS_CREATE.sh ' ||
sqlerrm);
end;
--calling ins_or_upd_tca_process_log to update log table for tstlms.dat file
begin
rgis_tca_to_tlms_process.ins_or_upd_tca_process_log
               (l_tstlms_file_id,
               'tstlms.dat',
               l_conc_prog_name,
               l_request_id,
               l_start_date,
               l_end_date,
               &tot_rec_in_tstlms,
               &tot_rec_in_tstlms_bad,
               null,
               null,               
               null,
               null,
               null,
               null,
               null,
               l_requested_by,
               l_requested_date,
               null,
               null,
               null,
               null,
               null);
exception
when others then
fnd_file.put_line(fnd_file.log, 'Error:RGIS_TCA_TO_TLMS_CREATE.sh');
fnd_file.put_line(fnd_file.log,
'Error occured when executing rgis_tca_to_tlms_process.ins_or_upd_tca_process_log for tstlms file');
fnd_file.put_line(fnd_file.log, sqlerrm);
end;
--calling ins_or_upd_tca_process_log to update log table for tstlmsedits.dat file
begin
rgis_tca_to_tlms_process.ins_or_upd_tca_process_log
               (l_tstlmsedits_file_id,
               'tstlmsedits.dat',
               l_conc_prog_name,
               l_request_id,
               l_start_date,
               l_end_date,
               &tot_rec_in_tstlmsedits,
               &tot_rec_in_tstlmsedits_bad,
               null,
               null,               
               null,
               null,
               null,
               null,
               null,
               l_requested_by,
               l_requested_date,
               null,
               null,
               null,
               null,
               null);
exception
when others then
fnd_file.put_line(fnd_file.log, 'Error:RGIS_TCA_TO_TLMS_CREATE.sh');
fnd_file.put_line(fnd_file.log,
'Error occured when executing rgis_tca_to_tlms_process.ins_or_upd_tca_process_log for tstlmsedits file');
fnd_file.put_line(fnd_file.log, sqlerrm);
end;
end;
exit;
EOF
echo "rgis_tca_to_tlms_process.sql started"
sqlplus -s $fcp_login @$SCHED_TOP/sql/rgis_tca_to_tlms_process.sql $fcp_reqid
exit;
echo "rgis_tca_to_tlms_process.sql ended"
Error:
----------------------------------
RGIS Scheduling: Version : UNKNOWN
Copyright (c) 1979, 1999, Oracle Corporation. All rights reserved.
TCATLMS module: TCA To TLMS Import Process
Current system time is 18-AUG-2011 06:13:27
COUNT(*)
     16
COUNT(*)
     25
wc: cannot open /home/custom/sched/in/tstlms.bad
wc: cannot open /home/custom/sched/in/tstlmsedits.bad
pl/sql block started
old 33:     AND fcr.request_id = &fcp_reqid; --fnd_global.conc_request_id();
new 33:     AND fcr.request_id = 18661823; --fnd_global.conc_request_id();
old 63:                &tot_rec_in_tstlms,
new 63:                16,
old 64:                &tot_rec_in_tstlms_bad,
new 64:                ,
old 97:                &tot_rec_in_tstlmsedits,
new 97:                25,
old 98:                &tot_rec_in_tstlmsedits_bad,
new 98:                ,
ERROR at line 64:
ORA-06550: line 64, column 4:
PLS-00103: Encountered the symbol "," when expecting one of the following:
( - + case mod new not null others <an identifier>
<a double-quoted delimited-identifier> <a bind variable> avg
count current exists max min prior sql stddev sum variance
execute forall merge time timestamp interval date
<a string literal with character set specification>
<a number> <a single-quoted SQL string> pipe
<an alternatively-quoted string literal with character set specification>
<an alternatively-q
ORA-06550: line 98, column 4:
PLS-00103: Encountered the symbol "," when expecting one of the following:
( - + case mod new not null others <an identifier>
<a double-quoted delimited-identifier> <a bind variable> avg
count current exists max min prior sql st
rgis_tca_to_tlms_process.sql started
old 12: and concurrent_request_id = '&1';
new 12: and concurrent_request_id = '18661823';
old 18: and concurrent_request_id = '&1';
new 18: and concurrent_request_id = '18661823';
old 22: rgis_tca_to_tlms_process.run_tca_data(l_tstlms_file_id,&1);
new 22: rgis_tca_to_tlms_process.run_tca_data(l_tstlms_file_id,18661823);
old 33: rgis_tca_to_tlms_process.run_tca_data_history(l_tstlmsedits_file_id,&1);
new 33: rgis_tca_to_tlms_process.run_tca_data_history(l_tstlmsedits_file_id,18661823);
old 44: rgis_tca_to_tlms_process.send_tca_email('TCATLMS',&1);
new 44: rgis_tca_to_tlms_process.send_tca_email('TCATLMS',18661823);
declare
ERROR at line 1:
ORA-20001: Error occured when executing RGIS_TCA_TO_TLMS_PROCESS.sql ORA-01403:
no data found
ORA-06512: at line 59
Executing request completion options...
------------- 1) PRINT   -------------
Printing output file.
Request ID : 18661823      
Number of copies : 0      
Printer : noprint
Finished executing request completion options.
Concurrent request completed successfully
Current system time is 18-AUG-2011 06:13:29
---------------------------------------------------------------------------
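The two failures above line up: tstlms.bad and tstlmsedits.bad were never written, so the wc calls fail, tot_rec_in_tstlms_bad and tot_rec_in_tstlmsedits_bad stay empty, the DEFINEs substitute nothing, and lines 64 and 98 of the PL/SQL block collapse to a bare comma, which is exactly what PLS-00103 is complaining about. One way to make the SQL*Plus heredoc tolerant of unset shell variables (a sketch, using POSIX default expansion; the PL/SQL block itself would stay as in the original script):

# "${var:-0}" expands to 0 whenever the variable is unset or empty, so
# every define line always carries a number.
sqlplus -s $fcp_login <<EOF
define tot_rec_in_tstlms          = '${tot_rec_in_tstlms:-0}';
define tot_rec_in_tstlmsedits     = '${tot_rec_in_tstlmsedits:-0}';
define tot_rec_in_tstlms_bad      = '${tot_rec_in_tstlms_bad:-0}';
define tot_rec_in_tstlmsedits_bad = '${tot_rec_in_tstlmsedits_bad:-0}';
define fcp_reqid = '$fcp_reqid';
exit;
EOF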


Similar Messages

  • File is not creating on the Receiver for File Content Conversion

    Hi,
    i have created a scenario with this blog
    /people/shabarish.vijayakumar/blog/2006/02/27/content-conversion-the-key-field-problem
    for the sender side I am using File Content Conversion to read a .txt file, and on the receiver side I need an XML file; I have set that up.
    When I place the file in the sender folder it gets picked up, and Communication Channel monitoring in the Runtime Workbench shows it was processed successfully, but no XML file is created on the receiver side. I am also unable to see the messages in SXMB_MONI.
    Please suggest some ideas to solve this.
    Thanks,
    Giridhar.C

    Hi Giridhar.
    unable to see the messages on SXMB_MONI also
    If the file got picked up and Communication Channel monitoring in RWB shows it was processed without any errors, then it should show up in SXMB_MONI. Please check the audit log in RWB; there you can see the error message, if any.
    Please check the connection parameters in FTP and make sure that the same sender folder is not being used by any other scenario. If you have made any recent changes, do a cache refresh.
    Please post if you see any error in content conversion.
    Regards
    Srinivas

  • Batch Input file is not created for the program RFUMSV00 (VAT)

    Dear All,
    Yesterday we ran the program RFUMSV00 for posting VAT entries, but the program did not create any batch input file. We tried to create the file once again by running the program, but there was no data in it.
    I am able to view the spool which was created in SM37. It is in finished status.
    Can you please advise: do I need to delete any table entry for this (like BSET etc.), or how can I create the batch file again by running the program? I don't want to post manually through the FB41 transaction.
    Is there any Standard SAP solution for this.
    Please let me know anybody need any clarification about this question.
    Thanks in advance.
    Regards,
    Raja.A
    Edited by: Raja.A on Dec 1, 2009 12:47 PM

    Hi,
    Check the status of the batch in SM35 transaction.
    If the batch is locked by mistake or due to any other error, you can release it and then process it again.
    To release: Shift+F4.
    You can also analyse the job status with the F2 button.
    Bye

  • Row not creating in the results table of search page,plz help

    hi,
    I created a search page, and after entering the criteria and pressing the Go button, it throws an exception that the row is not created:
    oracle.apps.fnd.framework.OAException: oracle.jbo.RowCreateException: JBO-25017: Error while creating a new entity row for DeptEO.
    Can anyone help regarding this?

    Please post the full error stack.
    This message is not enough to diagnose the problem.
    --Prasanna                                                                                                                                                                                                       

  • Row not creating in the results table of search page,

    hi,
    I created a search page, and after entering the criteria and pressing the Go button, it throws an exception that the row is not created:
    oracle.apps.fnd.framework.OAException: oracle.jbo.RowCreateException: JBO-25017: Error while creating a new entity row for DeptEO.
    Can anyone help regarding this?

    Try the OA Framework Forum

  • Application does not start during the engine restart.

    Hello all,
    after deploying a custom-developed application to the engine, the application works fine. However, when the J2EE engine is restarted, the application page is not available and shows the following message:
    Application error occurred during request processing. Details:Error
    [javax.servlet.ServletException: Spo object was not created during the
    startup], with root cause [com.appname.exception.SevereException:
    Spo object was not created during the startup].
      Exception id: [0013729689A200590000009D00000D90000446937D1802B7]
    The application is shown as running in the Visual Administrator.
    The application page becomes available again when we restart the application manually from Visual Admin.
    So every time we restart the engine we also need to restart the application in order to get the application page back.
    There are many errors in the SPO log like the following:
    ERROR [SAPEngine_Application_Thread[impl:3]_25]  - Failed to set up connection pool: com.sap.engine.services.jndi.persistent.exceptions.NameNotFoundException: Path to object does not exist at java:comp, the whole lookup name is java:comp/env/jdbc/qaht6114.
    Can anyone please advise us on what could be the problem here?
    Thanks a lot in advance!
    Edited by: Rodrigo Castilhos on Mar 21, 2008 6:21 PM
    Edited by: Rodrigo Castilhos on Mar 25, 2008 6:48 PM

    Hi Parvez and Ivaylo,
    Have either of you used the start-up mode setting? I tried setting it to always, but the application still won't start. Is there anything I'm missing?
    My application-j2ee-engine.xml looks like this:
    <?xml version="1.0" encoding="UTF-8"?>
    <!DOCTYPE application-j2ee-engine SYSTEM "application-j2ee-engine.dtd">
    <application-j2ee-engine>
         <provider-name>nexeninc.com</provider-name>
         <fail-over-enable
              mode="disable"/>
         <start-up
              mode="always"/>
    </application-j2ee-engine>

  • File loaded successfully by SQLldr but external table failed

    Hi ,
    When I tried to create an external table to load data from a file, it failed with the error below:
    "KUP-04018: partial record at end of file",
    When I query the external table for rows with rownum < 2500 I can see the records loaded properly, but when I try to fetch all records it throws the above error.
    However, when I loaded the same file with SQL*Loader, all the records loaded successfully.
    Can you please let me know the reason for this?
    Regards

    Can you post us your SQLLDR control file, your external table definition, a sample of your data (preferably including the last lines of data which you believe are erroring) and also let us know your database version.
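    As a side note, KUP-04018 usually points at an incomplete last record, for example a data file whose final line is truncated or has no terminating newline. A quick shell check for the newline case (a sketch only; datafile.dat is a placeholder name) would be:

    # tail -c 1 prints the last byte of the file; command substitution
    # strips a trailing newline, so a non-empty result means the final
    # record is missing its newline terminator.
    if [ -n "$(tail -c 1 datafile.dat)" ]; then
        echo "datafile.dat: last record has no trailing newline"
    fi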

  • I upgraded to Maverick and now Time Machine will not connect to the external hard drive to back up files.  The external hard drive is a Western Digital "My Book Live" 2 TB.  How do I solve problem?

    I upgraded to Maverick and now Time Machine will not connect to the external hard drive to back up files.  The external hard drive is a Western Digital "My Book Live" 2 TB.  How do I solve problem?

    I also had problems accessing my WD MyBookLive after upgrading to Maverick.
    To resolve the problem, I used  Connect To Server (Cmd K) to specify the IP address.
    Then I enter a user id and password I created in the MyBookLive.
    You can also verify that you have admin access to the MyBookLive using Safari or Chrome.
    Just enter the MyBookLive IP address as the URL.

  • I have a problem. When exporting from PDF to PPTX error "unable to process the document in the module Save As file is not created." What to do?

    I have a problem. When exporting from PDF to PPTX error "unable to process the document in the module Save As file is not created." What to do?
    Windows 7 64
    PC

    everything works on a laptop (

  • Backups of the JVM.CONFIG file are not created

    Backups of the JVM.CONFIG file are not created.
    CF10 > Server Settings > Java and JVM   in part says:
    Backups of the jvm.config file are created when you hit the submit button. You can use this backup to restore from a critical change.
    CF10 Help page indicates jvm.bak will be created:
    http://help.adobe.com/en_US/ColdFusion/10.0/Admin/WSc3ff6d0ea77859461172e0811cbf3638e6-7ff c.html#WSc3ff6d0ea77859461172e0811cbf3638e6-7feb
    Note: the Help page incorrectly indicates that JVM.CONFIG and the BAK are in cf_root\runtime\bin, whereas JVM.CONFIG is located in cf_root\instance\bin.
    Thanks in advance, Carl.

    I must have not had enough coffee - JVM.BAK is created in where JVM.CONFIG is found. Pardon me.

  • How to solve "this type of file is not supported, or the required codec is not installed"?

    Hi all,
    I've just started using a new computer with WIndows 8.1 within the last week and have loaded my Adobe Premiere Elements 11.0 software into it.
    In trying to view and edit a video I created on my old computer, I got "Media Pending" and "Media Offline" notices as soon as I began. The source files are located on an external drive with a permanent letter designation. I began the process of reconnecting my Project Assets files with the source files this morning, but have now encountered a new roadblock. Each time I try, I get the message "this type of file is not supported, or the required codec is not installed." It offers no information or options other than to click "OK," which is not very helpful.
    All the source files are located in one folder on my external drive. I have been able to reconnect the JPEGs and the mp3 source files but not the AVI video files.
    Here are three screenshots. The first is the "Media Offline" message, the second is the message I receive when I try to reconnect my Project Assets to the source files and the third is a screenshot of the codecs installed on the new computer.
    Here is my new computer information:
    Model: Del Inspiron 3847
    Intel Core i5-4460
    CPU @ 3.20 GHz  3.20 GHz
    RAM 8 GB
    System type: 64-bit, x64-based processor
    I hope someone can guide me. What do I need to do to solve this problem?
    Thank you.
    Message was edited by: Janie Christensen Ficara

    StreetSongs
    Thank your for the information supplied.
    But, what are the properties of this .avi file that you seek?
    Video Compression
    Audio Compression
    Frame Size
    Frame Rate
    Interlaced or Progressive
    Pixel Aspect Ratio
    Right now of prime interest is the video compression that is being wrapped by that AVI wrapper/container.
    AVCHD.avi
    MotionJPEG.avi
    XviD.avi
    DivX.avi
    other
    There are instances where a particular video compression or a particular container/wrapper format is each supported by Premiere Elements, but not the combination of those particular two. In this instance, I suspect this to be a MotionJPEG video codec issue, but your details will point us to the actual situation, and we will plan the troubleshooting strategy from there.
    ATR

  • Video Sidecar Files (.THM) Not Created When Video Exported As Original - They should be.

    When exporting video in its original format, the .THM sidecar file (which contains all the relevant metadata for the video) should be included so that other programs (or even Lightroom itself, if using the Add to This Catalog option during export) can have access to the metadata for the video.
    Here is my use case:
    After importing the original video clips I would like to do the following to keep my footage clean and conserve space (as we all know HD video eats a LOT of space)
    1. Trim excess / unneeded footage from the clip
    2. Export the clip in its original format with a new name (such as MVI_1234_trimmed.mov) and have it added back to the catalog during the export process.
    3. Delete the original clip containing the unneeded footage.
    This workflow is almost perfect in the initial LR 4 beta release, except that a copy of the .THM file is not included when the video is exported. This means the new clip will not contain any metadata when added to LR or, presumably, other applications. I can manually get around this by duplicating the original .THM file, renaming the duplicate to match the name of the new trimmed clip (for example MVI_1234_trimmed.THM), and then manually importing it into Lightroom. However, this breaks the almost perfect workflow and increases time and room for error. It would no longer be an automated process that could easily be applied to clips in bulk.
    Please have LR create the sidecar file when exporting in original format, or at least have an option for it!

    The proposed workflow may not be valid based on the discussion here: http://forums.adobe.com/thread/947245?tstart=0 but the point about the sidecar file still stands. Exporting as H.264 does not preserve metadata either. LR shows no camera / exposure / date time information for the H.264 version.

  • Cannot add hub-managed content type with external list lookup columns to a list -- Error:Id field is not set on the external data field

    This is a variation on the issue mentioned in this
    post
    We are using the SP 2010 Content Hub to manage our content types. On the content hub we've created a couple of external lists, and then created some site columns as lookups against these lists. We then added the columns to one of our content types and set it to publish.
    After the publishing job executed, I tried adding the content type (which now appears on the subscriber sites) to one of the document libraries on one of the subscriber sites.  When I did that it threw the following error:
    Microsoft.SharePoint.WebControls.BusinessDataListConfigurationException: Id field is not set on the external data field    
    at Microsoft.SharePoint.SPBusinessDataField.CreateIdField(SPAddFieldOptions op)     
    at Microsoft.SharePoint.SPBusinessDataField.OnAdded(SPAddFieldOptions op)     
    at Microsoft.SharePoint.SPFieldCollection.AddFieldAsXmlInternal(String schemaXml, Boolean addToDefaultView, SPAddFieldOptions op, Boolean isMigration, Boolean fResetCTCol)     
    at Microsoft.SharePoint.SPContentType.ProvisionFieldOnList(SPField field, Boolean bRecurAllowed)     
    at Microsoft.SharePoint.SPContentType.ProvisionFieldsOnList()     
    at Microsoft.SharePoint.SPContentType.DeriveContentType(SPContentTypeCollection cts, SPContentType& ctNew)     
    at Microsoft.SharePoint.SPContentTypeCollection.AddContentTypeToList(SPContentType contentType)     
    at Microsoft.SharePoint.SPContentTypeCollection.AddContentType(SPContentType contentType, Boolean updateResourceFileProperty, Boolean checkName, Boolean setNextChildByte)     
    at Microsoft.SharePoint.SPContentTypeCollection.Add(SPContentType contentType)     
    at Microsoft.SharePoint.ApplicationPages.AddContentTypeToListPage.Update(Object o, EventArgs e)     
    at System.Web.UI.WebControls.Button.OnClick(EventArgs e)     
    at System.Web.UI.WebControls.Button.RaisePostBackEvent(String eventArgument)     
    at System.Web.UI.Page.RaisePostBackEvent(IPostBackEventHandler sourceControl, String eventArgument)     
    at System.Web.UI.Page.ProcessRequestMain(Boolean includeStagesBeforeAsyncPoint, Boolean includeStagesAfterAsyncPoint)    b55297ed-717f-466d-8bdc-297b20344d3f
    I checked the external content type configuration and it did specify an "id column". Does anyone know if what I am attempting to do is possible and, if so, what special configurations are required?
    Thanks

    The issue is not the external content type or the external list but the lookup column.
    It is not possible to publish a lookup column via the Content Type Hub.
    If you need to do this, an alternative is to use a Managed Metadata column instead; otherwise you will have to implement this via a feature.
    Varun Malhotra
    =================
    If my post solves your problem could you mark the post as Answered or Vote As Helpful if my post has been helpful for you.

  • RW-00023: Error:  DBC file was not created- When try to install 11i on OEL5

    I got an error in the appl-top log file as shown below; also, my installation wizard shows that the post-installation checks did not succeed. Please advise. :-)
    DBC File Check
    RW-00023: Error: - DBC file was not created:
    File = /d01/oracle/visappl/fnd/11.5.0/secure/VIS_oracleapp1/vis.dbc
    HTTP Check
    checking URL = http://oracleapp1.srini.co.uk:8000
    RW-50015: Error: - Portal is not responding. The service might not have started on the port yet. Please check the service and use the retry button.
    RW-50015: Error: - Portal is not responding. The service might not have started on the port yet. Please check the service and use the retry button.
    JSP Check
    checking URL = http://oracleapp1.srini.co.uk:8000/OA_HTML/jsp/fnd/fndhelp.jsp?dbc=/d01/oracle/visappl/fnd/11.5.0/secure/VIS_oracleapp1/vis.dbc
    RW-50015: Error: - JSP is not responding. The service might not have started on the port yet. Please check the service and use the retry button.
    PHP Check
    checking URL = http://oracleapp1.srini.co.uk:8000/OA_HTML/US/ICXINDEX.htm
    RW-50015: Error: - Login Page is not responding. The service might not have started on the port yet. Please check the service and use the retry button.
    JSP Check
    checking URL = http://oracleapp1.srini.co.uk:8000/OA_HTML/jsp/fnd/fndhelp.jsp?dbc=/d01/oracle/visappl/fnd/11.5.0/secure/VIS_oracleapp1/vis.dbc
    RW-50015: Error: - JSP is not responding. The service might not have started on the port yet. Please check the service and use the retry button.
    PHP Check
    checking URL = http://oracleapp1.srini.co.uk:8000/OA_HTML/US/ICXINDEX.htm
    RW-50015: Error: - Login Page is not responding. The service might not have started on the port yet. Please check the service and use the retry button.

    Hi JD,
    I was able to fix this issue. :-) Thanks for your help, JD.
    As per Metalink note 747424.1, I had replaced all occurrences of LD_ASSUME_KERNEL with XD_ASSUME_KERNEL only in the DB tier, but not in the apps tier, and this was causing the above issue in the installation of Oracle 11i on OEL5.
    -- Once again I went back and replaced all occurrences of LD_ASSUME_KERNEL with XD_ASSUME_KERNEL in both files:
    (db tier)
    <DB_HOME>/appsutil/bin/adgetlnxver.sh (did it earlier)
    (apps tier)
    <APPL_TOP>/ad/11.5.0/bin/adgetlnxver.sh
    -- Ran autoconfig on the apps tier, which completed successfully (and also generated the DBC files).
    -- Ran adstrtal.sh and retried the rapidwiz once more (Oracle Discoverer services VIS_oracleapp1 - addisctl.sh - this one failed).
    -- This time everything checked OK in the installation wizard.
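    For anyone hitting the same thing, the change described above boils down to a substitution in each tier's copy of adgetlnxver.sh; a sketch (assuming DB_HOME and APPL_TOP point at the homes mentioned above) would be:

    # Replace every occurrence of LD_ASSUME_KERNEL with XD_ASSUME_KERNEL
    # in the DB tier and the apps tier copies of adgetlnxver.sh.
    sed -i 's/LD_ASSUME_KERNEL/XD_ASSUME_KERNEL/g' $DB_HOME/appsutil/bin/adgetlnxver.sh
    sed -i 's/LD_ASSUME_KERNEL/XD_ASSUME_KERNEL/g' $APPL_TOP/ad/11.5.0/bin/adgetlnxver.sh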

  • Trying to use a new, larger external hard drive for my Time Machine backup.  However, every time I start the backup, it gets started then fails.  And, I can't delete the few files that did save on the external.  Sort of a catch 22.  Any ideas?

    Trying to use a new, larger external hard drive for my Time Machine backup.  However, every time I start the backup, it gets started then fails.  And, I can't delete the few files that did save on the external.  Sort of a catch 22.  Any ideas?

    Is it a USB hard drive?  USB hard drives have the problem of not giving full speed if they are hooked up on the same bus as keyboards and mice.  Double check your profiler to make sure that is not a problem.  If it is Firewire, make sure there aren't other firewire devices in use at the same time.  I recommend not only keeping a Time Machine backup, but also a clone, and if you do use Time Machine, to make sure the Time Machine drive or partition is at least twice the size of the original drive.
