Track flat files that failed loading in SQL*Loader
Hi,
Can anyone suggest a way to track the flat files that fail to load in SQL*Loader, and then pass each failed file's name into a column of a database table?
Thanks in advance.
Edited by: 806821 on Nov 2, 2010 10:22 AM
Hi Morgan, thanks for your reply.
Define failed. 1 row not loaded ... no rows not loaded ... what operating system ... what version of the Oracle database ... track in a table, send an email?
Your inquiry is long on generalities and short on specifics: Fill in all the blanks ... not just the ones I've listed above.
Even if only one row fails to load, the file should be considered failed, and the name of that particular flat file should be captured.
The operating system is Unix.
The Oracle database we are using is R12.
Track in a table, yes; and we want to send an email notification whenever a flat file fails to load.
Thanks once again...!!
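For what it's worth, a common approach on Unix is to wrap sqlldr in a small shell script that checks its exit status and bad file, then records the failing file's name in a tracking table. A minimal sketch, assuming hypothetical names (LOAD_FAILURES table, load.ctl control file, /data/incoming path, scott/tiger credentials), none of which come from the thread:

```shell
#!/bin/sh
# Sketch: wrap sqlldr and record every file that did not load cleanly.
for f in /data/incoming/*.dat; do
  sqlldr userid=scott/tiger control=load.ctl data="$f" \
         log="${f%.dat}.log" bad="${f%.dat}.bad"
  rc=$?
  # sqlldr exit codes: 0 = success, 2 = warning (some rows rejected),
  # 1 and 3 = errors. A non-empty .bad file also means rejected rows.
  if [ $rc -ne 0 ] || [ -s "${f%.dat}.bad" ]; then
    sqlplus -s scott/tiger <<EOF
INSERT INTO load_failures (file_name, failed_on) VALUES ('$f', SYSDATE);
COMMIT;
EOF
    # An email notification (e.g. via mailx) could be sent here as well.
  fi
done
```

Since the requirement counts a single rejected row as a failure, testing the bad file with `-s` matters: a warning exit (2) plus a non-empty bad file is exactly that case.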
Similar Messages
-
Loading from multiple flat files to same table using SQL Loader
Hi Gurus,
Can anyone please brief me on the pros and cons of kicking off multiple SQL*Loader sessions that read multiple flat files but insert into just one table.
The table is not partitioned. Avg record counts for each flat file is about 5-6 million.
Oracle 11g,
OS: Linux
Regards
Cherrish Vaidiyan
Vaidiyan wrote:
Hi Gurus,
Can anyone please brief me on the pros and cons of kicking off multiple SQL*Loader sessions that read multiple flat files but insert into just one table.
Cherrish,
Pros -> Faster loading of more data
Cons -> Potential performance degradation
Test how resource-consuming this task would be, and compare the priority of the multi-file, multi-session load against the other work that will be happening in the database at the same time, so you can decide how to share resources during that window. -
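As a sketch of the multi-session approach (file, control-file, and table names below are made up): conventional-path sessions can append into the same table concurrently, while a direct-path load would need PARALLEL=TRUE to avoid the exclusive table lock.

```shell
#!/bin/sh
# Sketch: one conventional-path sqlldr session per flat file,
# all appending into the same table via the same control file.
for f in part1.dat part2.dat part3.dat; do
  sqlldr userid=scott/tiger control=stage.ctl data="$f" \
         log="$f.log" bad="$f.bad" &
done
wait   # block until every background session finishes
```

With 5-6 million rows per file, it would be worth testing whether a single direct-path load per file (run serially or with PARALLEL=TRUE) beats several concurrent conventional loads.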
How to Load Multiple Files in Oracle Database using Sql Loader
Hi All,
I want to import multiple files into my DB using SQL*Loader. Please tell me the syntax: how can I import multiple files using one control file?
Thanks & Regards,
Imran
Hi,
You might get a good response to your post in the forum dedicated to data movement, including SQL*Loader. You can find it here: Export/Import/SQL Loader & External Tables
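For the syntax question itself: a single control file can read several data files by repeating the INFILE clause. A minimal sketch (file and table names are illustrative):

```
-- Sketch: one control file, several input files (names are made up).
LOAD DATA
INFILE 'file1.dat'
INFILE 'file2.dat'
INFILE 'file3.dat'
APPEND
INTO TABLE target_tbl
FIELDS TERMINATED BY ','
(col1, col2, col3)
```

Alternatively, one control file with no fixed INFILE can be reused by passing a different `data=` parameter on each sqlldr invocation.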
Regards, -
Comparison of Data Loading techniques - Sql Loader & External Tables
Below are two techniques for loading data from flat files into Oracle tables.
1) SQL Loader:
a. Place the flat file (.txt or .csv) in the desired location.
b. Create a control file
LOAD DATA
INFILE 'Mytextfile.txt' -- file containing the table data; specify the path correctly (it could be .csv as well)
APPEND -- or TRUNCATE, based on requirement
INTO TABLE oracle_tablename
FIELDS TERMINATED BY ',' -- or whatever delimiter the input file uses
OPTIONALLY ENCLOSED BY '"'
(field1, field2, field3)
c. Now run Oracle's sqlldr utility from the OS command prompt:
sqlldr username/password control=filename.ctl
d. The data can be verified by selecting the data from the table.
Select * from oracle_table;
2) External Table:
a. Place the flat file (.txt or .csv) on the desired location.
abc.csv
1,one,first
2,two,second
3,three,third
4,four,fourth
b. Create a directory
create or replace directory ext_dir as '/home/rene/ext_dir'; -- path where the source file is kept
c. After granting appropriate permissions to the user, we can create external table like below.
create table ext_table_csv (
  i number,
  n varchar2(20),
  m varchar2(20)
)
organization external (
  type oracle_loader
  default directory ext_dir
  access parameters (
    records delimited by newline
    fields terminated by ','
    missing field values are null
  )
  location ('abc.csv')
)
reject limit unlimited;
d. Verify data by selecting it from the external table now
select * from ext_table_csv;
The external tables feature is a complement to existing SQL*Loader functionality.
It allows you to –
• Access data in external sources as if it were in a table in the database.
• Merge a flat file with an existing table in one statement.
• Sort a flat file on the way into a table you want compressed nicely
• Do a parallel direct path load without having to split up the input file yourself.
Shortcomings:
• External tables are read-only.
• No data manipulation language (DML) operations or index creation is allowed on an external table.
Using Sql Loader You can –
• Load the data from a stored procedure or trigger (insert is not sqlldr)
• Do multi-table inserts
• Flow the data through a pipelined plsql function for cleansing/transformation
Comparison for data loading
To make the loading operation faster, the degree of parallelism can be set to any number, e.g. 4.
So, when you created the external table, the database will divide the file to be read by four processes running in parallel. This parallelism happens automatically, with no additional effort on your part, and is really quite convenient. To parallelize this load using SQL*Loader, you would have had to manually divide your input file into multiple smaller files.
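Building on the external table created above, this can be sketched as follows (the target table name is illustrative):

```
-- Sketch: read the external table with four parallel processes,
-- then direct-path insert into a target table.
alter table ext_table_csv parallel 4;

insert /*+ append */ into target_table
select * from ext_table_csv;
commit;
```

The database parallelizes the reads over the single external file; with SQL*Loader you would have to split the file and run one session per piece to get the same effect.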
Conclusion:
SQL*Loader may be the better choice in data loading situations that require additional indexing of the staging table. However, we can always copy the data from external tables to Oracle tables using DB links. Please let me know your views on this.
-
Need faster data loading (using sql-loader)
i am trying to load approx. 230 million records (around 60-bytes per record) from a flat file into a single table. i have tried sql-loader (conventional load) and i'm seeing performance degrade as the file is being processed. i am avoiding direct path sql-loading because i need to maintain uniqueness using my primary key index during the load. so the degradation of the load performance doesn't shock me. the source data file contains duplicate records and may contain records that are duplicates of those that are already in the table (i am appending during sql-loader).
my other option is to unload the entire table to a flat file, concatenate the new file onto it, run it through a unique sort, and then direct-path load it.
has anyone had a similar experience? any cool solutions available that are quick?
thanks,
jeff
It would be faster to pull the data into an Oracle table, call it a temporary table, and then make a final move into the final table.
This way you could direct load into an oracle table then you could
INSERT /*+ APPEND */ INTO final_table
SELECT DISTINCT *
FROM temp_table
ORDER BY id;
This would do a 'direct load' type move from your temp table to the final table, automatically merging the duplicate records.
So
1) Direct Load from SQL*Loader into temp table.
2) Place index (non-unique) on temp table column ID.
3) Direct load INSERT into the final table.
Step 2 may make this process faster or slower, only testing will tell.
Good Luck,
Eric Kamradt -
SQL Loader error: SQL*Loader-926. Please help
Hi,
While loading some files to my database table, I am getting the following error. I am using 'Truncate' option while loading the file:
Error:
====
SQL*Loader-926: OCI error while executing delete/truncate (due to REPLACE/TRUNCATE keyword) for table LOS_STAGE_DS4
ORA-01426: numeric overflow
Here are the loader properties (excerpts from the load log):
================================
SQL*Loader: Release 11.1.0.6.0 - Production on Fri Nov 26 04:54:18 2010
Copyright (c) 1982, 2007, Oracle. All rights reserved.
Control File: d:\Prod\rent_Load\Bin\rent_Load.ctl
Data File: d:\Prod\rent_Load\Data\rent.704
Bad File: d:\Prod\rent_Load\Bad\rent.704
Discard File: none specified
(Allow all discards)
Number to load: ALL
Number to skip: 0
Errors allowed: 1000000000
Bind array: 64 rows, maximum of 256000 bytes
Continuation: none specified
Path used: Conventional
Table LS_STAGE, loaded from every logical record.
Insert option in effect for this table: TRUNCATE
Column Name Position Len Term Encl Datatype
Could someone please help and advise what is the root cause of this error?
Thanks,
The root cause is in the error ORA-01426, which you can look up in the online error documentation at http://tahiti.oracle.com . No one knows every error message by heart. This means you are expected to look up the error prior to posting, and you shouldn't expect any volunteer in this forum to look up the error on your behalf.
Also this is a typical candidate for being a known problem, and known problems can be found on My Oracle Support.
Sybrand Bakker
Senior Oracle DBA -
Check data before loading through SQL *Loader
Hi all,
I have a temp table which is loaded through SQL*Loader. This table is used by a procedure for inserting data into another table.
I frequently get ORA-01722 errors during the procedure's execution.
I have decided to check for the error data through the control file itself.
I have few doubts about SQL Loader.
Will a record containing character data for a column declared as INTEGER EXTERNAL in the control file get discarded?
Does declaring a column as INTEGER EXTERNAL take care of NULL values?
Does the whole record get discarded if one of the column values is misplaced in the record in the input file?
Control File is of following format:
LOAD DATA
APPEND INTO TABLE Temp
FIELDS TERMINATED BY "|" optionally enclosed by "'"
trailing nullcols
(FILEDATE DATE 'DD/MM/YYYY',
ACC_NUM INTEGER EXTERNAL,
REC_TYPE,
LOGO, -- data: numeric; column declared: VARCHAR
CARD_NUM INTEGER EXTERNAL,
ACTION_DATE DATE 'DD/MM/YYYY',
EFFECTIVE_DATE DATE 'DD/MM/YYYY',
ACTION_AMOUNT, -- data: numeric; column declared: NUMBER
ACTION_STORE, -- data: numeric; column declared: VARCHAR
ACTION_AUTH_NUM,
ACTION_SKU_NUM,
ACTION_CASE_NUM)
What changes do I need to make in this file regarding the above questions? Is there any online document for this?
Here it is -
Why no exclusive lock for conventional path loading for SQL*Loader?
why no exclusive lock for conventional path loading for SQL*Loader?
It uses INSERT statements, so it should take an exclusive lock, right?
Thanks
OK, so only an UPDATE statement would take a lock, but not an INSERT?
I ask because I have seen a situation where a user updating rows in a session (without committing) prevented another user from updating those rows.
Thanks -
Load XML File into temporary tables using sql loader
Hi All,
I have an XML file as below. I need to insert the contents into a temporary staging table using SQL*Loader. Please advise how I should do that.
For example, Portfolios should go into a separate table, and all the tags inside it should be populated in the columns of the table.
Family should go into a separate table, and all the tags inside it should be populated in the columns of the table.
Similarly offer, Products etc.
<ABSProductCatalog xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
<ProductSalesHierachy>
<Portfolios>
<Portfolio productCode="P1">
<Attribute name="CatalogProductName" value="Access" />
<Attribute name="Status" value="Active" />
</Portfolio>
<Portfolio productCode="P2">
<Attribute name="CatalogProductName" value="Data" />
<Attribute name="Status" value="Active" />
</Portfolio>
<Portfolio productCode="P3">
<Attribute name="CatalogProductName" value="Voice" />
<Attribute name="Status" value="Active" />
</Portfolio>
<Portfolio productCode="P4">
<Attribute name="CatalogProductName" value="Wireless" />
<Attribute name="Status" value="Active" />
</Portfolio>
</Portfolios>
<Families>
<Family productCode="F1">
<Attribute name="CatalogProductName" value="Internet Access Services" />
<Attribute name="Status" value="Active" />
<ParentHierarchy>
<Item productCode="P1" modelType="Portfolio" />
</ParentHierarchy>
</Family>
<Family productCode="F2">
<Attribute name="CatalogProductName" value="Local Access Services" />
<Attribute name="Status" value="Active" />
<ParentHierarchy>
<Item productCode="P2" modelType="Portfolio" />
</ParentHierarchy>
</Family>
</Families>
<SubFamilies>
<SubFamily productCode="SF1">
<Attribute name="CatalogProductName" value="Business Internet service" />
<Attribute name="Status" value="Active" />
<ParentHierarchy>
<Item productCode="F1" modelType="Family" />
</ParentHierarchy>
</SubFamily>
</SubFamilies>
<ProductRefs>
<ProductRef productCode="WSP1" modelType="Wireline Sales Product">
<ActiveFlag>Y</ActiveFlag>
<ProductHierarchy>
<SalesHierarchy family="F1" subFamily="SF1" portfolio="P1" primary="Y" />
<SalesHierarchy family="F2" portfolio="P2" primary="N" />
<FinancialHierarchy quotaBucket="Voice" strategicProdCategory="Local Voice" />
</ProductHierarchy>
</ProductRef>
<ProductRef productCode="MSP2" modelType="Handset">
<ActiveFlag>Y</ActiveFlag>
<ProductHierarchy>
<SalesHierarchy portfolio="P4" primary="Y" />
</ProductHierarchy>
</ProductRef>
</ProductRefs>
</ProductSalesHierachy>
<Offers>
<Offer productCode="ABN">
<OfferName>ABN</OfferName>
<OfferDescription>ABN Description</OfferDescription>
<Segments>
<Segment>SCG</Segment>
<Segment>PCG</Segment>
</Segments>
<OfferUpdateDate>2009-11-20</OfferUpdateDate>
<ActiveFlag>Y</ActiveFlag>
</Offer>
<Offer productCode="OneNet">
<OfferName>OneNet</OfferName>
<OfferDescription>OneNet Description</OfferDescription>
<Segments>
<Segment>SCG</Segment>
<Segment>PCG</Segment>
<Segment>PCG2</Segment>
</Segments>
<OfferUpdateDate>2009-11-20</OfferUpdateDate>
<ActiveFlag>Y</ActiveFlag>
</Offer>
</Offers>
<Products>
<Product productCode="WSP1" modelType="Wireline Sales Product">
<ProductName>AT&amp;T High Speed Internet</ProductName>
<ProductDescription>High Speed Internet</ProductDescription>
<LegacyCoProdIndicator>SBC</LegacyCoProdIndicator>
<RevenueCBLCode>1234B</RevenueCBLCode>
<VolumeCBLCode>4567A</VolumeCBLCode>
<SAARTServiceIDCode>S1234</SAARTServiceIDCode>
<MarginPercentRequired>Y</MarginPercentRequired>
<PercentIntl>%234</PercentIntl>
<UOM>Each</UOM>
<PriceType>OneTime</PriceType>
<ProductStatus>Active</ProductStatus>
<Compensable>Y</Compensable>
<Jurisdiction>Everywhere</Jurisdiction>
<ActiveFlag>Y</ActiveFlag>
<Availabilities>
<Availability>SE</Availability>
<Availability>E</Availability>
</Availabilities>
<Segments>
<Segment>SCG</Segment>
<Segment>PCG</Segment>
</Segments>
<VDIndicator>Voice</VDIndicator>
<PSOCCode>PSOC 1</PSOCCode>
<USBilled>Y</USBilled>
<MOWBilled>N</MOWBilled>
<ProductStartDate>2009-11-20</ProductStartDate>
<ProductUpdateDate>2009-11-20</ProductUpdateDate>
<ProductEndDate>2010-11-20</ProductEndDate>
<AliasNames>
<AliasName>AT&amp;T HSI</AliasName>
<AliasName>AT&amp;T Fast Internet</AliasName>
</AliasNames>
<OfferTypes>
<OfferType productCode="ABN" endDate="2009-11-20" />
<OfferType productCode="OneNet" />
</OfferTypes>
<DynamicAttributes>
<DynamicAttribute dataType="String" defaultValue="2.5 Mbps" name="Speed">
<AttrValue>1.5 Mbps</AttrValue>
<AttrValue>2.5 Mbps</AttrValue>
<AttrValue>3.5 Mbps</AttrValue>
</DynamicAttribute>
<DynamicAttribute dataType="String" name="TransportType">
<AttrValue>T1</AttrValue>
</DynamicAttribute>
</DynamicAttributes>
</Product>
<Product productCode="MSP2" modelType="Handset">
<ProductName>Blackberry Bold</ProductName>
<ProductDescription>Blackberry Bold Phone</ProductDescription>
<LegacyCoProdIndicator />
<RevenueCBLCode />
<VolumeCBLCode />
<SAARTServiceIDCode />
<MarginPercentRequired />
<PercentIntl />
<UOM>Each</UOM>
<PriceType />
<ProductStatus>Active</ProductStatus>
<Compensable />
<Jurisdiction />
<ActiveFlag>Y</ActiveFlag>
<Availabilities>
<Availability />
</Availabilities>
<Segments>
<Segment>SCG</Segment>
<Segment>PCG</Segment>
</Segments>
<VDIndicator>Voice</VDIndicator>
<PSOCCode />
<USBilled />
<MOWBilled />
<ProductStartDate>2009-11-20</ProductStartDate>
<ProductUpdateDate>2009-11-20</ProductUpdateDate>
<AliasNames>
<AliasName />
</AliasNames>
<OfferTypes>
<OfferType productCode="ABN" />
</OfferTypes>
<DynamicAttributes>
<DynamicAttribute dataType="String" name="StlmntContractType">
<AttrValue />
</DynamicAttribute>
<DynamicAttribute dataType="String" name="BMG 2 year price">
<AttrValue>20</AttrValue>
</DynamicAttribute>
<DynamicAttribute dataType="String" name="MSRP">
<AttrValue>40</AttrValue>
</DynamicAttribute>
<DynamicAttribute dataType="String" name="BMGAvailableType">
<AttrValue />
</DynamicAttribute>
<DynamicAttribute dataType="String" name="ProductId">
<AttrValue>123456</AttrValue>
</DynamicAttribute>
<DynamicAttribute dataType="String" name="modelSource">
<AttrValue>product</AttrValue>
</DynamicAttribute>
</DynamicAttributes>
</Product>
</Products>
<CatalogChanged>Y</CatalogChanged>
</ABSProductCatalog>
Two options that come to mind. Others exist.
#1 - {thread:id=474031}, which is basically storing the XML in an Object Relational structure for parsing
#2 - Dump the XML into either an XMLType based table or column and use SQL (with XMLTable) to create a view that parses the data. This would be the same as the view shown in the above post.
Don't use SQL*Loader to parse the XML. I was trying to find a post from mdrake about that but couldn't. In short, SQL*Loader was not built as an XML parser, so don't try to use it that way. -
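As a sketch of option #2 above (the staging table name is made up, and the XPath mappings are assumptions based on the sample document, not tested against it):

```
-- Sketch: stage the document in an XMLType table, then shred one
-- branch (Portfolios) into relational rows with XMLTABLE.
create table xml_stage of xmltype;

select p.product_code, p.name, p.status
from   xml_stage x,
       xmltable('/ABSProductCatalog/ProductSalesHierachy/Portfolios/Portfolio'
                passing x.object_value
                columns
                  product_code varchar2(10) path '@productCode',
                  name   varchar2(40) path 'Attribute[@name="CatalogProductName"]/@value',
                  status varchar2(10) path 'Attribute[@name="Status"]/@value') p;
```

The same pattern, with a different XPath per branch, would populate the Family, Offer, and Product staging tables; each SELECT can feed an INSERT into its staging table.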
Loading of flat file (csv) into PSA – no data loaded
Hi BW-gurus,
We have an issue regarding loading a flat file (csv) into PSA using an infopackage (BI 7.0).
The infopackage has been used for a while. Previously, consultants with the SAP_ALL profile ran the infopackage. Now we want a few super users to run it.
We have created a role for the super users, including authorization objects:
Data Warehousing objects: S_RS_ADMWB
Activity: 03, 16, 23, 63, 66
Data Warehousing Workbench obj: INFOAREA, INFOOBJECT, INFOPACKAG, MONITOR, SOURCESYS, WORKBENCH
Data Warehousing Workbench - datasource (version > BW 3.x): S_RS_DS
Activity: All
Datasource: All
Subobject for New DataSource: All
Sourcesystem: FILE
Data Warehousing Workbench - infosource (flex update): S_RS_ISOUR
Activity: Display, Maintain, Request
Application Component: All
InfoSource: All
InfoSource Subobject: All values
As mentioned, the infopackage in question, has been used by consultants with SAP_ALL-profile for some time, and been working just fine. When the super users with the new role are executing the infopackage, the records are found, but not loaded into PSA. The load seems to be stuck, but no error message occurs. The file we are trying to load contains only 15 records.
Details monitor:
Overall status: Missing messages or warnings (yellow)
Requests (messages): Everything ok (green)
-> Data request arranged (green)
-> Confirmed with: OK (green)
Extraction (messages): Errors occurred (yellow)
-> Data request received (green)
-> Data selection scheduled (green)
-> 15 Records sent (0 Records received) (yellow)
-> Data selection ended (green)
Transfer (IDocs and TRFC): Missing messages (yellow)
Processing (data packet): Warnings received (yellow)
-> Data package 1 (? Records): Missing messages (yellow)
-> Inbound processing (0 records): Missing messages (yellow)
-> Update PSA (0 Records posted): Missing messages (yellow)
-> Processing end: Missing messages (yellow)
Have we forgotten something? Any assistance will be highly appreciated!
Cheers,
Anne Therese S. Johannessen
Hi,
Try using transaction ST01 to trace the authorizations used when uploading with SAP_ALL.
Then enhance the profile for the super user accordingly.
Best regards
Matthias -
Flat File reconciliation failing with no error
Hello,
I'm trying to set up flat file reconciliation with OIM 11g. I've followed this guide
http://www.oracle.com/webfolder/technetwork/tutorials/obe/fmw/oim/10.1.4/oim/obe12_using_gtc_for_reconciliation/using_the_gtc.htm
and configured the mapping for my fields. When I execute the scheduled task to launch the reconciliation process, I get no errors, but no user is created either. In OIM's output, the only message relevant to the reconciliation that is printed is
<Sep 7, 2012 3:12:44 PM EDT> <Warning> <XELLERATE.GC.PROVIDER.RECONCILIATIONTRANSPORT> <BEA-000000> <FILE SUCCESSFULLY ARCHIVED : /u01/flat_files/hcm_1.txt>
In OIM's logs, there are a bunch of notifications about the values retrieved from the flat file, and the logs end with this :
[2012-09-07T15:12:44.511-04:00] [oim_server1] [NOTIFICATION] [IAM-5010000] [oracle.iam.reconciliation.impl] [tid: [ACTIVE].ExecuteThread: '3' for queue: 'weblogic.kernel.Default (self-tuning)'] [userId: oiminternal] [ecid: 11d1def534ea1be0:315618d7:139a11a9ada:-8000-0000000000000002,0] [APP: oim#11.1.1.3.0] Generic Information: Query for getTargetTableEntity=select RECON_CITY,RECON_USR_MANAGER_KEY,RECON_ORG_NAME,RECON_EMPLOYEEID,RECON_DEPARTMENTID,RECON_HIREDATE,RECON_ADDRESS,RECON_MANAGER_LOGIN,RECON_DISPLAYNAME66894369,RECON_CHGLOGATTR_IDXLST,RECON_USR_END_DATE,RECON_PHONE,RECON_ACT_KEY,RECON_COUNTRY,RECON_USR_LOGIN,RECON_USR_TYPE,re_key,RECON_POSTAL,RECON_STATE,RECON_USR_EMAIL,RECON_USR_EMP_TYPE,RECON_USR_PASSWORD,RECON_LASTNAME,RECON_USR_START_DATE,RECON_FIRSTNAME from RA_MOCKPSHCMGTC85 where EXISTS (select re_key from recon_events where rb_key=6 and recon_events.re_key=RA_MOCKPSHCMGTC85.re_key)
[2012-09-07T15:12:44.517-04:00] [oim_server1] [NOTIFICATION] [IAM-5010000] [oracle.iam.reconciliation.impl] [tid: [ACTIVE].ExecuteThread: '3' for queue: 'weblogic.kernel.Default (self-tuning)'] [userId: oiminternal] [ecid: 11d1def534ea1be0:315618d7:139a11a9ada:-8000-0000000000000002,0] [APP: oim#11.1.1.3.0] Generic Information: getTargetTableEntity =[{usr_manager_key=null, Locality Name=St-glinglin3, Organization Name=Xellerate Users, Employee Number=c2345678, Department Number=01, Hire Date=Sat Dec 31 00:00:00 EST 2011, Manager Login=null, Home Postal Address=2021 du fin fin3, Display Name=c2345678, RECON_CHGLOGATTR_IDXLST=4,1,6,10,8,9,14,2,13,12,3,15,7,5,11, End Date=null, Home Phone=514-555-1234, act_key=1, Country=Mozambique3, User Login=c2345678, Xellerate Type=null, re_key=6, Postal Address=2020 du fin fin3, State=Arizona3, Email=null, Role=Full-Time, usr_password= , Last Name=Doe3, Start Date=null, First Name=John3}]
[2012-09-07T15:12:44.518-04:00] [oim_server1] [NOTIFICATION] [IAM-5010000] [oracle.iam.reconciliation.impl] [tid: [ACTIVE].ExecuteThread: '3' for queue: 'weblogic.kernel.Default (self-tuning)'] [userId: oiminternal] [ecid: 11d1def534ea1be0:315618d7:139a11a9ada:-8000-0000000000000002,0] [APP: oim#11.1.1.3.0] Generic Information: Query for getTargetTableEntity=select RECON_USR_MANAGER_KEY,MLS_LOCALE_CODE,RECON_USR_END_DATE,RECON_ORG_NAME,RECON_ACT_KEY,RECON_USR_LOGIN,RECON_USR_TYPE,re_key,RECON_MANAGER_LOGIN,RECON_USR_EMAIL,RECON_DISPLAYNAME66894369,RECON_USR_EMP_TYPE,RECON_USR_PASSWORD,RECON_USR_START_DATE,RECON_CHGLOGATTR_IDXLST from RA_MLS_MOCKPSHCMGTC85 where EXISTS (select re_key from recon_events where rb_key=6 and recon_events.re_key=RA_MLS_MOCKPSHCMGTC85.re_key)
[2012-09-07T15:12:44.521-04:00] [oim_server1] [NOTIFICATION] [IAM-5010000] [oracle.iam.reconciliation.impl] [tid: [ACTIVE].ExecuteThread: '3' for queue: 'weblogic.kernel.Default (self-tuning)'] [userId: oiminternal] [ecid: 11d1def534ea1be0:315618d7:139a11a9ada:-8000-0000000000000002,0] [APP: oim#11.1.1.3.0] Generic Information: getTargetTableEntity ={}
[2012-09-07T15:12:44.523-04:00] [oim_server1] [NOTIFICATION] [IAM-5010000] [oracle.iam.reconciliation.impl] [tid: [ACTIVE].ExecuteThread: '3' for queue: 'weblogic.kernel.Default (self-tuning)'] [userId: oiminternal] [ecid: 11d1def534ea1be0:315618d7:139a11a9ada:-8000-0000000000000002,0] [APP: oim#11.1.1.3.0] Generic Information: Query for getTargetTableEntity=select usr_territory,usr_pwd_warn_date,usr_emp_no,usr_locale,usr_middle_name,usr_manually_locked,usr_update,usr_disabled,usr_date_format,usr_display_name,usr_timezone,usr_mobile,usr_locked,usr_ldap_organization,usr_pwd_reset_attempts_ctr,usr_currency,usr_end_date,usr_deprovisioned_date,usr_pager,usr_time_format,usr_created,usr_deprovisioning_date,usr_po_box,usr_color_contrast,usr_create,usr_full_name,usr_ldap_guid,usr_country,usr_accessibility_mode,usr_type,usr_change_pwd_at_next_logon,usr_pwd_expire_date,usr_pwd_cant_change,re_key,usr_email,usr_provisioned_date,usr_data_level,usr_common_name,usr_automatically_delete_on,usr_locked_on,usr_login_attempts_ctr,usr_last_name,usr_start_date,usr_first_name,usr_manager_key,usr_locality_name,usr_policy_update,usr_number_format,usr_street,usr_embedded_help,usr_pwd_expired,usr_dept_no,usr_hire_date,usr_createby,usr_pwd_warned,usr_home_postal_address,usr_telephone_number,usr_name_preferred_lang,usr_font_size,usr_updateby,usr_description,usr_home_phone,usr_ldap_organization_unit,usr_pwd_min_age_date,usr_fax,usr_postal_code,act_key,usr_key,usr_login,usr_title,usr_status,usr_gen_qualifier,usr_postal_address,usr_state,usr_pwd_never_expires,usr_initials,usr_pwd_must_change,usr_emp_type,usr_ldap_dn,usr_password,usr_pwd_generated,usr_language,usr_provisioning_date from RECON_USER_OLDSTATE where EXISTS (select re_key from recon_events where rb_key=6 and recon_events.re_key=RECON_USER_OLDSTATE.re_key)
[2012-09-07T15:12:44.527-04:00] [oim_server1] [NOTIFICATION] [IAM-5010000] [oracle.iam.reconciliation.impl] [tid: [ACTIVE].ExecuteThread: '3' for queue: 'weblogic.kernel.Default (self-tuning)'] [userId: oiminternal] [ecid: 11d1def534ea1be0:315618d7:139a11a9ada:-8000-0000000000000002,0] [APP: oim#11.1.1.3.0] Generic Information: getTargetTableEntity ={}
Since I can see no errors, I'm not really sure where to look to understand why users are not created. Any ideas?
Thanks,
--jtellier
Looking at your log, it is clear that you are populating Xellerate Type=null. This is a mandatory field and can't be null. However, when you create a user using the UI, the default value "End-Users" is passed automatically, because we have the corresponding "Design Console" access check box on the OIM user profile.
Just map the constant value for trusted recon
Xellerate Type=End-Users
--nayan -
Insert data file name into table from sql loader
Hi All,
I have a requirement to insert the data file name dynamically into a table using SQL*Loader.
Example:
sqlldr userid=username/passwword@host_string control=test_ctl.ctl data=test_data.dat
test_ctl.ctl
LOAD DATA
FIELDS TERMINATED BY ','
INTO TABLE test
(empid number,
ename varchar2(20),
file_name varchar2(20) -- this should be the data file name, which can be dynamic (coming from a parameter)
)
test_data.dat
1,test
2,hello
3,world
4,end
Please help..
Thanks in advance.
Regards
Anuj
You'll probably have to write your control file on the fly, using a .bat or .sh file.
rem ===== file : test.bat ========
rem
rem ============== in pseudo speak =============
rem
rem
echo LOAD DATA > test.ctl
echo FIELDS TERMINATED BY ',' >> test.ctl
echo INTO TABLE test >> test.ctl
echo (empid number, >> test.ctl
echo ename varchar2(20), >> test.ctl
echo file_name constant %1% >> test.ctl
echo ) >> test.ctl
rem
rem
rem
sqlldr userid=username/passwword@host_string control=test.ctl data=test_data.dat
rem =============== end of file test.bat =======================
http://download.oracle.com/docs/cd/B19306_01/server.102/b14215/ldr_field_list.htm#i1008664 -
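Since the original poster is on Unix, the same write-the-control-file-on-the-fly idea can be sketched with a here-document (an untested sketch; the credentials, table, and file names are taken from the question):

```shell
#!/bin/sh
# Usage: ./gen_load.sh test_data.dat
# Generates a control file embedding the data file name as a CONSTANT,
# then runs the load with that file.
DATAFILE=$1
cat > test.ctl <<EOF
LOAD DATA
FIELDS TERMINATED BY ','
INTO TABLE test
(empid,
 ename,
 file_name CONSTANT '$DATAFILE'
)
EOF
sqlldr userid=username/passwword@host_string control=test.ctl data="$DATAFILE"
```

Because the here-document is unquoted, `$DATAFILE` is expanded when the control file is written, so every loaded row carries the file name in the file_name column.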
Creating interactive swf files that contain flash files that don't play once loaded on server
I am creating an interactive SWF file that contains an F4V flash movie. When I preview the page in InDesign it plays, but once I export the InDesign file to SWF and insert that on an HTML page using Dreamweaver (so I can send a link and not a file), the movie doesn't play; only the first frame of the clip shows. I have uploaded the F4V file to our server, relinked it in the original InDesign file, and it still doesn't play. Any suggestions or similar issues?
I have left the page up for quite a while to see if it is just taking longer to load that clip and that doesn't seem to be it. I am just stumped and honestly not knowledgable on how websites and servers work - just know how to get my documents up so I can send links.
-
Structure of the flat file that uses bapi_po_create1 ?
Hi People,
I am going to create a purchase order using BAPI_PO_CREATE1, to upload the file from legacy to R/3. What will the structure of the flat file be, and what will be the key to differentiate different purchase orders? (For example, in the vendor master, the vendor number is the key to differentiate the records. As we all know, the purchase order is created only at the end of the transaction, so what will be the key to differentiate each PO record?)
Hi Siva,
Check the code below. You can refer to the fields to prepare the input file.
*& Report YDM_PO_CREATE1 *
REPORT ydm_po_create1.
*-- Input File Declaration
TYPES: BEGIN OF ty_input_file,
column1 TYPE char50,
column2 TYPE char50,
column3 TYPE char50,
column4 TYPE char50,
column5 TYPE char50,
column6 TYPE char50,
column7 TYPE char50,
column8 TYPE char50,
column9 TYPE char50,
column10 TYPE char50,
column11 TYPE char50,
column12 TYPE char50,
column13 TYPE char50,
column14 TYPE char50,
column15 TYPE char50,
column16 TYPE char50,
column17 TYPE char50,
column18 TYPE char50,
END OF ty_input_file.
DATA: i_input_file TYPE STANDARD TABLE OF ty_input_file,
wa_input_file TYPE ty_input_file.
CONSTANTS: c_path TYPE char20 VALUE 'C:\',
c_mask TYPE char9 VALUE ',*.*,*.*.',
c_mode TYPE char1 VALUE 'O',
c_filetype TYPE char10 VALUE 'ASC',
c_x TYPE char01 VALUE 'X'.
PARAMETERS : p_fname LIKE rlgrap-filename.
AT SELECTION-SCREEN ON VALUE-REQUEST FOR p_fname.
*-- Browse Presentation Server
PERFORM f4_presentation_file.
START-OF-SELECTION.
*-- Read presentation server file
PERFORM f1003_upload_file.
IF NOT i_input_file[] IS INITIAL.
PERFORM split_data.
ENDIF.
*& Form f4_presentation_file
*& F4 Help for presentation server
FORM f4_presentation_file .
CALL FUNCTION 'WS_FILENAME_GET'
EXPORTING
def_path = c_path
mask = c_mask
mode = c_mode
title = text-001
IMPORTING
filename = p_fname
EXCEPTIONS
inv_winsys = 1
no_batch = 2
selection_cancel = 3
selection_error = 4
OTHERS = 5.
IF sy-subrc <> 0.
MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
ENDIF.
ENDFORM. " f4_presentation_file
*& Form f1003_upload_file
*& Upload File
FORM f1003_upload_file .
DATA: lcl_filename TYPE string.
lcl_filename = p_fname.
CALL FUNCTION 'GUI_UPLOAD'
EXPORTING
filename = lcl_filename
filetype = c_filetype
has_field_separator = c_x
TABLES
data_tab = i_input_file
EXCEPTIONS
file_open_error = 1
file_read_error = 2
no_batch = 3
gui_refuse_filetransfer = 4
invalid_type = 5
no_authority = 6
unknown_error = 7
bad_data_format = 8
header_not_allowed = 9
separator_not_allowed = 10
header_too_long = 11
unknown_dp_error = 12
access_denied = 13
dp_out_of_memory = 14
disk_full = 15
dp_timeout = 16
OTHERS = 17.
IF sy-subrc <> 0.
MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
EXIT.
ENDIF.
ENDFORM. " f1003_upload_file
*& Form split_data
*& Collect data for creating Purchase Order
FORM split_data .
DATA: i_poitem TYPE STANDARD TABLE OF bapimepoitem,
i_poitemx TYPE STANDARD TABLE OF bapimepoitemx,
i_poitem_sch TYPE STANDARD TABLE OF bapimeposchedule,
i_poitem_schx TYPE STANDARD TABLE OF bapimeposchedulx,
i_acct_ass TYPE STANDARD TABLE OF bapimepoaccount,
i_acct_assx TYPE STANDARD TABLE OF bapimepoaccountx,
i_services TYPE STANDARD TABLE OF bapiesllc ,
i_srvacc TYPE STANDARD TABLE OF bapiesklc,
i_return TYPE STANDARD TABLE OF bapiret2,
wa_header TYPE bapimepoheader,
wa_headerx TYPE bapimepoheaderx,
wa_poitem TYPE bapimepoitem,
wa_poitemx TYPE bapimepoitemx,
wa_poitem_sch TYPE bapimeposchedule,
wa_poitem_schx TYPE bapimeposchedulx,
wa_acct_ass TYPE bapimepoaccount,
wa_acct_assx TYPE bapimepoaccountx,
wa_services TYPE bapiesllc,
wa_srvacc TYPE bapiesklc,
wa_return TYPE bapiret2,
ws_po TYPE bapimepoheader-po_number.
wa_services-pckg_no = 10.
wa_services-line_no = 1.
wa_services-outl_no = '0'.
wa_services-outl_ind = c_x.
wa_services-subpckg_no = 20.
APPEND wa_services TO i_services.
wa_srvacc-pckg_no = 10.
wa_srvacc-line_no = 1.
wa_srvacc-serno_line = 01.
wa_srvacc-serial_no = 01.
wa_srvacc-percentage = 100.
APPEND wa_srvacc TO i_srvacc.
LOOP AT i_input_file INTO wa_input_file.
IF wa_input_file-column2 EQ 'HD'.
wa_header-doc_type = wa_input_file-column3.
wa_header-creat_date = sy-datum.
wa_header-created_by = sy-uname.
wa_header-vendor = wa_input_file-column4.
PERFORM conversion_output USING wa_header-vendor
CHANGING wa_header-vendor.
wa_header-comp_code = 'DE03'.
wa_header-purch_org = 'DE03'.
wa_header-pur_group = 'DE1'.
wa_header-vper_start = wa_input_file-column9.
wa_header-vper_end = wa_input_file-column10.
wa_headerx-comp_code = c_x.
wa_headerx-doc_type = c_x.
wa_headerx-creat_date = c_x.
wa_headerx-created_by = c_x.
wa_headerx-vendor = c_x.
wa_headerx-purch_org = c_x.
wa_headerx-pur_group = c_x.
wa_headerx-vper_start = c_x.
wa_headerx-vper_end = c_x.
ENDIF.
IF wa_input_file-column2 EQ 'IT'.
wa_poitem-po_item = wa_input_file-column3.
wa_poitem-short_text = wa_input_file-column6.
wa_poitem-plant = wa_input_file-column8.
wa_poitem-quantity = '1'.
wa_poitem-tax_code = 'V0'.
wa_poitem-item_cat = 'D'.
wa_poitem-acctasscat = 'K'.
wa_poitem-matl_group = wa_input_file-column7.
wa_poitem-pckg_no = '10'.
APPEND wa_poitem TO i_poitem .
wa_poitemx-po_item = wa_input_file-column3.
wa_poitemx-po_itemx = c_x.
wa_poitemx-short_text = c_x.
wa_poitemx-plant = c_x.
wa_poitemx-quantity = c_x.
wa_poitemx-tax_code = c_x.
wa_poitemx-item_cat = c_x.
wa_poitemx-acctasscat = c_x.
wa_poitemx-matl_group = c_x.
wa_poitemx-pckg_no = c_x.
APPEND wa_poitemx TO i_poitemx.
wa_poitem_sch-po_item = wa_input_file-column3.
wa_poitem_sch-delivery_date = sy-datum.
APPEND wa_poitem_sch TO i_poitem_sch.
wa_poitem_schx-po_item = wa_input_file-column3.
wa_poitem_schx-po_itemx = c_x.
wa_poitem_schx-delivery_date = c_x.
APPEND wa_poitem_schx TO i_poitem_schx.
wa_acct_ass-po_item = 10.
wa_acct_ass-serial_no = 01.
wa_acct_ass-gl_account = '0006360100'.
wa_acct_ass-co_area = '1000'.
wa_acct_ass-costcenter = 'KC010000'.
APPEND wa_acct_ass TO i_acct_ass.
wa_acct_ass-po_item = 10.
wa_acct_ass-serial_no = 02.
wa_acct_ass-gl_account = '0006360100'.
wa_acct_ass-co_area = '1000'.
wa_acct_ass-costcenter = 'KC010000'.
APPEND wa_acct_ass TO i_acct_ass.
wa_acct_assx-po_item = 10.
wa_acct_assx-serial_no = 01.
wa_acct_assx-po_itemx = c_x.
wa_acct_assx-serial_nox = c_x.
wa_acct_assx-gl_account = c_x.
wa_acct_assx-co_area = c_x.
wa_acct_assx-costcenter = c_x.
APPEND wa_acct_assx TO i_acct_assx.
wa_acct_assx-po_item = 10.
wa_acct_assx-serial_no = 02.
wa_acct_assx-po_itemx = c_x.
wa_acct_assx-serial_nox = c_x.
wa_acct_assx-gl_account = c_x.
wa_acct_assx-co_area = c_x.
wa_acct_assx-costcenter = c_x.
APPEND wa_acct_assx TO i_acct_assx.
wa_services-pckg_no = 20.
wa_services-line_no = 2.
wa_services-service = wa_input_file-column9.
wa_services-quantity = '100'.
wa_services-gr_price = '100'.
wa_services-userf1_txt = wa_input_file-column13.
APPEND wa_services TO i_services.
wa_srvacc-pckg_no = 20.
wa_srvacc-line_no = 1.
wa_srvacc-serno_line = 02.
wa_srvacc-serial_no = 02.
wa_srvacc-percentage = 100.
APPEND wa_srvacc TO i_srvacc.
ENDIF.
ENDLOOP.
CALL FUNCTION 'BAPI_PO_CREATE1'
EXPORTING
poheader = wa_header
poheaderx = wa_headerx
*   POADDRVENDOR =
*   TESTRUN =
*   MEMORY_UNCOMPLETE =
*   MEMORY_COMPLETE =
*   POEXPIMPHEADER =
*   POEXPIMPHEADERX =
*   VERSIONS =
*   NO_MESSAGING =
*   NO_MESSAGE_REQ =
*   NO_AUTHORITY =
*   NO_PRICE_FROM_PO =
IMPORTING
exppurchaseorder = ws_po
*   EXPHEADER =
*   EXPPOEXPIMPHEADER =
TABLES
return = i_return
poitem = i_poitem
poitemx = i_poitemx
*   POADDRDELIVERY =
poschedule = i_poitem_sch
poschedulex = i_poitem_schx
poaccount = i_acct_ass
*   POACCOUNTPROFITSEGMENT =
poaccountx = i_acct_assx
*   POCONDHEADER =
*   POCONDHEADERX =
*   POCOND =
*   POCONDX =
*   POLIMITS =
*   POCONTRACTLIMITS =
poservices = i_services
posrvaccessvalues = i_srvacc
*   POSERVICESTEXT =
*   EXTENSIONIN =
*   EXTENSIONOUT =
*   POEXPIMPITEM =
*   POEXPIMPITEMX =
*   POTEXTHEADER =
*   POTEXTITEM =
*   ALLVERSIONS =
*   POPARTNER =
.
break gbpra8. " debugging breakpoint left in by the poster; remove before transport
LOOP AT i_return INTO wa_return.
" inspect the BAPI return messages here
ENDLOOP.
ENDFORM. " split_data
*&---------------------------------------------------------------------*
*&      Form  conversion_output
*&---------------------------------------------------------------------*
*       Apply the ALPHA conversion exit on the input value
FORM conversion_output USING p_ip
CHANGING p_op.
CALL FUNCTION 'CONVERSION_EXIT_ALPHA_INPUT'
EXPORTING
input = p_ip
IMPORTING
output = p_op.
ENDFORM. " conversion_output
I also suggest searching SDN with the keyword BAPI_PO_CREATE1; you will find more useful links.
Hope this helps.
Manish
-
Conditional multiple loading by SQL Loader
Hi All,
I am trying to load data from flat files into multiple tables. However, the tables are related, and I want that when a record is rejected while loading into one table, the related records for the other table are rejected too.
For instance, I have two tables EMP_INFO and EMP_LEAVE in which I want to load data:
EmpID Fname Lname (EMP_INFO record)
34002 Rahul Agarwal
EmpID Casual Medical (EMP_LEAVE record)
34002 5 2
If even one of the records above fails, no record with EmpID 34002 should be loaded.
However, I must mention that the table EMP_LEAVE does not contain records for all employees. This is why I cannot arrange the flat file in a single-line structure and load it this way from the CTL file:
INTO TABLE emp_info
(EmpID POSITION(1:10) CHAR,
Fname POSITION(12:22) CHAR)
INTO TABLE emp_leave
(EmpID POSITION(1:10) CHAR,
Casual POSITION(25:27) INTEGER EXTERNAL)
because for most employees the leave columns would be blank. I want the conditional loading above only for those employees whose leave fields are not blank.
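A side note on the blank-leave-column objection: SQL*Loader's WHEN clause is evaluated per INTO TABLE clause, so the emp_leave insert can be skipped for records whose leave field is blank while emp_info is still loaded. The sketch below follows the fixed positions quoted above; the file names are assumptions. It is written as a shell heredoc since the thread is on Unix:

```shell
# Sketch only: WHEN filters rows for its own INTO TABLE clause, so records
# with a blank leave field (positions 25:27) load into emp_info but not
# emp_leave. Positions follow the layout quoted in the thread.
cat > emp_multi.ctl <<'EOF'
LOAD DATA
INFILE 'emp.dat'
APPEND
INTO TABLE emp_info
  (EmpID  POSITION(1:10)  CHAR,
   Fname  POSITION(12:22) CHAR)
INTO TABLE emp_leave
  WHEN (25:27) != BLANKS
  (EmpID  POSITION(1:10)  CHAR,
   Casual POSITION(25:27) INTEGER EXTERNAL)
EOF
cat emp_multi.ctl
```

This handles the skip-blank case, though on its own it does not give the all-or-nothing behavior per EmpID that the question also asks for.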
Any pointers much appreciated. Thanks.
Best,
Rahul.

Hi Rahul,
this is only a suggestion...
You have table A as the master and B as the detail.
Your flat file contains records pertinent to tables A and B.
Parent records appear in the file before their child records.
Rows destined for table B must be discarded if the parent record in table A is not found.
Under those assumptions, you can define a foreign key constraint on table B referencing table A's primary key.
Then raise the ERRORS limit accepted during the load, so that if a parent record is not loaded or found, the child records are rejected by the constraint as well.
Hope this helps
Max
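Tying this back to the original question (tracking which flat files failed): on Unix, SQL*Loader writes rejected rows to a .bad file named after the data file, so a non-empty .bad file means at least one row was rejected, which is exactly the poster's definition of "failed". The sketch below demonstrates only the bookkeeping step with stand-in files; the tracking-table insert via sqlplus, its table name, and the connect string are assumptions shown as comments, not executed here.

```shell
#!/bin/sh
# After each sqlldr run, a non-empty <datafile>.bad file means at least one
# row was rejected; record that flat file's name for the email notification.
record_if_failed() {
  flat_file=$1
  bad_file="${flat_file%.*}.bad"
  if [ -s "$bad_file" ]; then
    # In the real script this would insert into a tracking table, e.g.
    # (table name and credentials are assumptions):
    #   sqlplus -s "$CONN" <<SQL
    #   INSERT INTO failed_loads (file_name, failed_on)
    #   VALUES ('$flat_file', SYSDATE);
    #   COMMIT;
    #   SQL
    echo "$flat_file" >> failed_files.txt
  fi
}

rm -f failed_files.txt
# Demonstration with stand-in .bad files (no Oracle needed):
printf 'rejected row\n' > emp1.bad   # emp1.dat had a rejection
: > emp2.bad                         # emp2.dat loaded cleanly
record_if_failed emp1.dat
record_if_failed emp2.dat
cat failed_files.txt
```

sqlldr's exit status is another signal worth checking: on Unix it returns EX_WARN (2) when rows were rejected, so the wrapper script can branch on `$?` as well as on the .bad file.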