Error when executing an interface that loads data from a CSV file with 320 columns
Hi,
Can someone provide a resolution for the error below?
I have created an interface that loads data from a CSV file with 320 columns into a synonym that also has 320 columns,
using LKM File to SQL and IKM SQL Control Append.
I get the following error when executing the interface:
com.sunopsis.tools.core.exception.SnpsSimpleMessageException: ODI-17517: Error during task interpretation. Task: 6 java.lang.Exception: BeanShell script error: Sourced file: inline evaluation of: ``out.print("The application script threw an exception: java.lang.StringIndexOutOf . . . '' Token Parsing Error: Lexical error at line 2, column 42. Encountered: "\\" (92), after : "": <at unknown location> BSF info: Create external table at line: 0 column: columnNo
at com.sunopsis.dwg.codeinterpretor.SnpCodeInterpretor.transform(SnpCodeInterpretor.java:485)
at com.sunopsis.dwg.dbobj.SnpSessStep.createTaskLogs(SnpSessStep.java:711)
at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java:461)
at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java:2093)
at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$2.doAction(StartSessRequestProcessor.java:366)
at oracle.odi.core.persistence.dwgobject.DwgObjectTemplate.execute(DwgObjectTemplate.java:216)
at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.doProcessStartSessTask(StartSessRequestProcessor.java:300)
at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.access$0(StartSessRequestProcessor.java:292)
at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$StartSessTask.doExecute(StartSessRequestProcessor.java:855)
at oracle.odi.runtime.agent.processor.task.AgentTask.execute(AgentTask.java:126)
at oracle.odi.runtime.agent.support.DefaultAgentTaskExecutor$2.run(DefaultAgentTaskExecutor.java:82)
at java.lang.Thread.run(Thread.java:662)
Caused by: org.apache.bsf.BSFException: BeanShell script error: Sourced file: inline evaluation of: ``out.print("The application script threw an exception: java.lang.StringIndexOutOf . . . '' Token Parsing Error: Lexical error at line 2, column 42. Encountered: "\\" (92), after : "": <at unknown location>
BSF info: Create external table at line: 0 column: columnNo
at bsh.util.BeanShellBSFEngine.eval(Unknown Source)
at bsh.util.BeanShellBSFEngine.exec(Unknown Source)
at com.sunopsis.dwg.codeinterpretor.SnpCodeInterpretor.transform(SnpCodeInterpretor.java:471)
... 11 more
Text: The application script threw an exception: java.lang.StringIndexOutOfBoundsException: String index out of range: 2 BSF info: Create external table at line: 0 column: columnNo
out.print("createTblCmd = r\"\"\"\ncreate table ") ;
out.print(odiRef.getTable("L", "COLL_NAME", "W")) ;
out.print("<?=(extTabColFormat.getUseView())?\"_ET\":\"\"?>\n(\n\t") ;
out.print(odiRef.getColList("", "[CX_COL_NAME]\\t"+
"<?=extTabColFormat.getExtTabDataType(\\u0022[CX_COL_NAME]\\u0022,\\u0022[SOURCE_DT]\\u0022, \\u0022[DEST_WRI_DT]\\u0022, \\u0022[COL_FORMAT]\\u0022, \\u0022[BYTES]\\u0022, \\u0022[LONGC]\\u0022, \\u0022[SCALE]\\u0022)?>"
, ",\\n\\t", "","")) ;
out.print("\n)\nORGANIZATION EXTERNAL\n(\n\tTYPE ORACLE_LOADER\n\tDEFAULT DIRECTORY dat_dir\n\tACCESS PARAMETERS\n\t(\n\t\tRECORDS DELIMITED BY 0x'") ;
out.print(odiRef.getSrcTablesList("[XFILE_SEP_ROW]","")) ;
out.print("'\n\t\t") ;
out.print(odiRef.getUserExit("EXT_CHARACTERSET")) ;
out.print("\n\t\t") ;
out.print(odiRef.getUserExit("EXT_STRING_SIZE")) ;
out.print("\n\t\tBADFILE\t\t'") ;
out.print(odiRef.getSrcTablesList("", "[RES_NAME]", "", "")) ;
out.print("_%a.bad'\n\t\tLOGFILE\t\t'") ;
out.print(odiRef.getSrcTablesList("", "[RES_NAME]", "", "")) ;
out.print("_%a.log'\n\t\tDISCARDFILE\t'") ;
out.print(odiRef.getSrcTablesList("", "[RES_NAME]", "", "")) ;
out.print("_%a.dsc'\n\t\tSKIP \t\t") ;
out.print(odiRef.getSrcTablesList("", "[FILE_FIRST_ROW]", "", "")) ;
out.print("\n") ;
if (odiRef.getSrcTablesList("", "[FILE_FORMAT]", "", "").equals("F")) {out.print("\n\t\tFIELDS\n\t\t") ;
out.print(odiRef.getUserExit("EXT_MISSING_FIELD")) ;
out.print("\n\t\t(\n\t\t\t") ;
out.print(odiRef.getColList("", "[CX_COL_NAME]\\tPOSITION([FILE_POS]:[FILE_END_POS])\\t"+
"<?=extTabColFormat.getExtTabFormat(\\u0022[CX_COL_NAME]\\u0022,\\u0022[SOURCE_DT]\\u0022, \\u0022DEST_WRI_DT\\u0022, \\u0022[COL_FORMAT]\\u0022, \\u0022[BYTES]\\u0022, \\u0022[LONGC]\\u0022, \\u0022[SCALE]\\u0022)?>"
, ",\\n\\t\\t\\t", "","")) ;
out.print("\t\t\n\t\t)\n\t)\n") ;
} else {out.print("\n\t\tFIELDS TERMINATED BY x'") ;
out.print(odiRef.getSrcTablesList("", "[XFILE_SEP_FIELD]", "", "")) ;
out.print("'\n\t\t") ;
if(odiRef.getSrcTablesList("", "[FILE_ENC_FIELD]", "", "").equals("")){out.print("\n\t\t") ;
} else {out.print("OPTIONALLY ENCLOSED BY '") ;
out.print(odiRef.getSrcTablesList("", "[FILE_ENC_FIELD]", "", "").substring(0,1)) ;
out.print("' AND '") ;
out.print(odiRef.getSrcTablesList("", "[FILE_ENC_FIELD]", "", "").substring(1,2)) ;
out.print("' ") ;
}out.print("\n\t\t") ;
out.print(odiRef.getUserExit("EXT_MISSING_FIELD")) ;
out.print("\n\t\t(\n\t\t\t") ;
out.print(odiRef.getColList("", "[CX_COL_NAME]\\t"+
"<?=extTabColFormat.getExtTabFormat(\\u0022[CX_COL_NAME]\\u0022,\\u0022[SOURCE_DT]\\u0022, \\u0022DEST_WRI_DT\\u0022, \\u0022[COL_FORMAT]\\u0022, \\u0022[BYTES]\\u0022, \\u0022[LONGC]\\u0022, \\u0022[SCALE]\\u0022)?>"
, ",\\n\\t\\t\\t", "","")) ;
out.print("\t\t\n\t\t)\n\t)\n") ;
}out.print("\tLOCATION (") ;
out.print(odiRef.getSrcTablesList("", "'[RES_NAME]'", "", "")) ;
out.print(")\n)\n") ;
out.print(odiRef.getUserExit("EXT_PARALLEL")) ;
out.print("\nREJECT LIMIT ") ;
out.print(odiRef.getUserExit("EXT_REJECT_LIMIT")) ;
out.print("\n\"\"\"\n \n# Create the statement\nmyStmt = myCon.createStatement()\n \n# Execute the trigger creation\nmyStmt.execute(createTblCmd)\n \nmyStmt.close()\nmyStmt = None\n \n# Commit, just in case\nmyCon.commit()") ;
****** ORIGINAL TEXT ******
createTblCmd = r"""
create table <%=odiRef.getTable("L", "COLL_NAME", "W")%><?=(extTabColFormat.getUseView())?"_ET":""?>
<%=odiRef.getColList("", "[CX_COL_NAME]\t"+
"<?=extTabColFormat.getExtTabDataType(\u0022[CX_COL_NAME]\u0022,\u0022[SOURCE_DT]\u0022, \u0022[DEST_WRI_DT]\u0022, \u0022[COL_FORMAT]\u0022, \u0022[BYTES]\u0022, \u0022[LONGC]\u0022, \u0022[SCALE]\u0022)?>"
, ",\n\t", "","")%>
ORGANIZATION EXTERNAL
TYPE ORACLE_LOADER
DEFAULT DIRECTORY dat_dir
ACCESS PARAMETERS
RECORDS DELIMITED BY 0x'<%=odiRef.getSrcTablesList("[XFILE_SEP_ROW]","")%>'
<%=odiRef.getUserExit("EXT_CHARACTERSET")%>
<%=odiRef.getUserExit("EXT_STRING_SIZE")%>
BADFILE '<%=odiRef.getSrcTablesList("", "[RES_NAME]", "", "")%>_%a.bad'
LOGFILE '<%=odiRef.getSrcTablesList("", "[RES_NAME]", "", "")%>_%a.log'
DISCARDFILE '<%=odiRef.getSrcTablesList("", "[RES_NAME]", "", "")%>_%a.dsc'
SKIP <%=odiRef.getSrcTablesList("", "[FILE_FIRST_ROW]", "", "")%>
<% if (odiRef.getSrcTablesList("", "[FILE_FORMAT]", "", "").equals("F")) {%>
FIELDS
<%=odiRef.getUserExit("EXT_MISSING_FIELD")%>
<%=odiRef.getColList("", "[CX_COL_NAME]\tPOSITION([FILE_POS]:[FILE_END_POS])\t"+
"<?=extTabColFormat.getExtTabFormat(\u0022[CX_COL_NAME]\u0022,\u0022[SOURCE_DT]\u0022, \u0022DEST_WRI_DT\u0022, \u0022[COL_FORMAT]\u0022, \u0022[BYTES]\u0022, \u0022[LONGC]\u0022, \u0022[SCALE]\u0022)?>"
, ",\n\t\t\t", "","")%>
<%} else {%>
FIELDS TERMINATED BY x'<%=odiRef.getSrcTablesList("", "[XFILE_SEP_FIELD]", "", "")%>'
<% if(odiRef.getSrcTablesList("", "[FILE_ENC_FIELD]", "", "").equals("")){%>
<%} else {%>OPTIONALLY ENCLOSED BY '<%=odiRef.getSrcTablesList("", "[FILE_ENC_FIELD]", "", "").substring(0,1)%>' AND '<%=odiRef.getSrcTablesList("", "[FILE_ENC_FIELD]", "", "").substring(1,2)%>' <%}%>
<%=odiRef.getUserExit("EXT_MISSING_FIELD")%>
<%=odiRef.getColList("", "[CX_COL_NAME]\t"+
"<?=extTabColFormat.getExtTabFormat(\u0022[CX_COL_NAME]\u0022,\u0022[SOURCE_DT]\u0022, \u0022DEST_WRI_DT\u0022, \u0022[COL_FORMAT]\u0022, \u0022[BYTES]\u0022, \u0022[LONGC]\u0022, \u0022[SCALE]\u0022)?>"
, ",\n\t\t\t", "","")%>
<%}%> LOCATION (<%=odiRef.getSrcTablesList("", "'[RES_NAME]'", "", "")%>)
<%=odiRef.getUserExit("EXT_PARALLEL")%>
REJECT LIMIT <%=odiRef.getUserExit("EXT_REJECT_LIMIT")%>
# Create the statement
myStmt = myCon.createStatement()
# Execute the trigger creation
myStmt.execute(createTblCmd)
myStmt.close()
myStmt = None
# Commit, just in case
myCon.commit().
at com.sunopsis.dwg.dbobj.SnpSessStep.createTaskLogs(SnpSessStep.java:738)
at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java:461)
at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java:2093)
at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$2.doAction(StartSessRequestProcessor.java:366)
at oracle.odi.core.persistence.dwgobject.DwgObjectTemplate.execute(DwgObjectTemplate.java:216)
at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.doProcessStartSessTask(StartSessRequestProcessor.java:300)
at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.access$0(StartSessRequestProcessor.java:292)
at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$StartSessTask.doExecute(StartSessRequestProcessor.java:855)
at oracle.odi.runtime.agent.processor.task.AgentTask.execute(AgentTask.java:126)
at oracle.odi.runtime.agent.support.DefaultAgentTaskExecutor$2.run(DefaultAgentTaskExecutor.java:82)
at java.lang.Thread.run(Thread.java:662)
The issue occurs because the text delimiter (FILE_ENC_FIELD) defined for the source file does not consist of a pair of delimiter characters, so the KM's substring(1,2) call fails.
Please see support note [ID 1469977.1] for details.
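The failure can be reproduced outside ODI: in the generated KM code above, `odiRef.getSrcTablesList("", "[FILE_ENC_FIELD]", "", "").substring(1,2)` assumes the text delimiter is two characters (an opening and a closing delimiter). A minimal sketch of that assumption (delimiter values here are illustrative):

```java
public class EncFieldDemo {
    // Mimics the KM template logic: split a FILE_ENC_FIELD value into its
    // opening and closing text delimiters via substring(0,1) / substring(1,2).
    static String[] splitEnclosure(String encField) {
        // The generated code calls these unconditionally, so a one-character
        // value throws StringIndexOutOfBoundsException ("index out of range: 2").
        return new String[] { encField.substring(0, 1), encField.substring(1, 2) };
    }

    public static void main(String[] args) {
        // A two-character value (opening + closing delimiter) splits cleanly.
        String[] pair = splitEnclosure("\"\"");
        System.out.println(pair[0] + " and " + pair[1]);
        try {
            splitEnclosure("\""); // a single-character delimiter, as in the failing interface
        } catch (StringIndexOutOfBoundsException e) {
            System.out.println("StringIndexOutOfBoundsException, as in the session log");
        }
    }
}
```

Defining the text delimiter of the file datastore as a pair of characters (or leaving it empty) avoids the exception.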
Similar Messages
-
Loading data from .csv file into existing table
Hi,
I have taken a look at several threads which talk about loading data from a .csv file into an existing or new table, and I have also checked out Vikas's application regarding the same. Let me explain my requirement with an example.
I have a .csv file and I want the data to be loaded into an existing table. The timesheet table columns are -
timesheet_entry_id,time_worked,timesheet_date,project_key .
The csv columns are :
project,utilization,project_key,timesheet_category,employee,timesheet_date , hours_worked etc.
What I need to know is this: before the CSV data is loaded into the timesheet table, is there any way of validating the project_key (the primary key of the projects table) against the projects table? I need to perform similar validations with other columns, such as customer_id against the customers table. Basically, the load should happen only after validating that the data exists in the parent table. Has anyone done this kind of loading through the APEX data-load utility, or is there another method of accomplishing the same?
Does Vikas's application do what the utility does? (I am assuming that, the code being from 2005, the utility was not incorporated in APEX at that time.) Any helpful advice is greatly appreciated.
Thanks,
Anjali
Hi Anjali,
Take a look at these threads which might outline different ways to do it -
File Browse, File Upload
Loading CSV file using external table
Loading a CSV file into a table
You can create hidden items on the page to validate previous records before inserting data.
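The parent-key validation can also be pushed into SQL itself (table and column names below are assumptions based on the description above): load the CSV into a staging table first, then insert only the rows whose keys exist in the parent table:

```sql
-- Hypothetical staging table loaded 1:1 from the CSV; adjust names to your schema.
INSERT INTO timesheet (timesheet_entry_id, time_worked, timesheet_date, project_key)
SELECT s.timesheet_entry_id, s.hours_worked, s.timesheet_date, s.project_key
FROM   timesheet_stage s
WHERE  EXISTS (SELECT 1 FROM projects p WHERE p.project_key = s.project_key);

-- Rows that fail validation can be reported with the opposite anti-join:
SELECT * FROM timesheet_stage s
WHERE  NOT EXISTS (SELECT 1 FROM projects p WHERE p.project_key = s.project_key);
```

The same EXISTS pattern extends to customer_id against the customers table.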
Hope this helps,
M Tajuddin
http://tajuddin.whitepagesbd.com -
Loading Data From .CSV files
Hi Gurus,
I am new to BW.
Can you please help me solve the problem below?
When I try to load data from a flat file (.csv) that looks like this:
Cust no, Cust name, Cust address
cust001,coca cola, Inc., 17 north road, Hyderabad - 35
the InfoObjects being
ZS_CUST, ZS_CUSTNM, ZS_CUSTAD
I get only the first three comma-separated values into my attribute table,
i.e. cust001, coca cola and Inc. go into three fields of the attribute table (and the PSA).
Now my point is: without replacing the separator ',' with anything else
(if we have millions of records in the file, it would be a time-consuming process to change ',' to some other separator), can we extract the complete data into the respective fields of the attribute table?
Thanks
Suri
Surender,
Once you specify "," as the field separator in BW, it will be interpreted as such. Of course, "Coca Cola, Inc" will then be interpreted and loaded as two separate fields.
"Data should be fixed in the source, whenever is possible". That's a phrase you'll hear a lot when dealing with BW, especially when loading data from flat files.
Good luck.
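Fixing the data in the source usually means enclosing such fields in quotes, so a quote-aware parser can tell a separator from a comma inside a value. A rough illustration (the quoting convention is an assumption about how the file could be fixed, not something BW does automatically):

```java
import java.util.ArrayList;
import java.util.List;

public class CsvSplitDemo {
    // Minimal quote-aware splitter: commas inside double quotes do not separate fields.
    static List<String> split(String line) {
        List<String> fields = new ArrayList<>();
        StringBuilder cur = new StringBuilder();
        boolean inQuotes = false;
        for (char c : line.toCharArray()) {
            if (c == '"') inQuotes = !inQuotes;
            else if (c == ',' && !inQuotes) { fields.add(cur.toString()); cur.setLength(0); }
            else cur.append(c);
        }
        fields.add(cur.toString());
        return fields;
    }

    public static void main(String[] args) {
        // Unquoted: the commas inside the name and address yield 5 fields instead of 3.
        System.out.println("cust001,coca cola, Inc., 17 north road, Hyderabad - 35".split(",").length);
        // Quoted source line parses into the intended 3 fields.
        System.out.println(split("cust001,\"coca cola, Inc.\",\"17 north road, Hyderabad - 35\"").size());
    }
}
```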
Regards,
Luis -
Loading data from .csv file into Oracle Table
Hi,
I have a requirement where I need to populate data from a .csv file into an Oracle table.
Is there any mechanism I can follow to do this?
Any help will be fruitful.
Thanks and regards
You can use SQL*Loader or external tables for your requirement.
Missed Karthick's post ... already there :)
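For the SQL*Loader route mentioned above, a minimal control-file sketch (file, table, and column names are illustrative placeholders):

```text
-- load.ctl: run with  sqlldr user/pass control=load.ctl
LOAD DATA
INFILE 'data.csv'
APPEND INTO TABLE target_table
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
(col1, col2, col3)
```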
Edited by: Rajneesh Kumar on Dec 4, 2008 10:54 AM -
Error when running package to load data from InfoProvider
I encounter the following error when using the standard package to load data from an InfoProvider into BPC. I tried loading the data from both a DSO and an InfoCube, but I get the same error.
The log message where the error occurs is as follows:
Task name CLEAR CUBE DATA:
Could not perform write (INHERITED ERROR)
The "convert" step of the process chain completed successfully. This error is happening when the "clear cube according data" step of the process chain gets executed.
Any ideas on why this is happening?
Thanks.
Hi Laeral:
You should be able to open just one VISA session, write and read both commands, then close the session at the end. This sounds to me like an error that comes from trying to open two VISA sessions to the instrument at the same time. I have attached some very basic LabVIEW VISA code that writes and reads two different commands. Hopefully it will get you started.
Regards,
Emilie S.
Applications Engineer
National Instruments
Attachments:
Basic VISA Example Two Commands.vi 39 KB -
Transfer (IDocs and TRFC) error when using an InfoPackage to load data from R/3
I created an InfoPackage to load data from R/3. When I start it immediately, no data is transferred. The returned error is:
transfer(IDocs and TRFC): Errors occurred
request IDoc: Application document not posted
Does anybody know how can correct this error?
Thanks a lot.
Hi Yimeng,
Check transaction SM58 in R/3 and see if any LUWs are stuck there. If so, you can execute them manually and complete the data load.
regards,
Sumit -
How to regularly load data from .csv file to database (using apex)
Hi,
I am using APEX 3. I need to load data from a CSV file into APEX, and this must happen automatically, through code, at a regular interval of 5-10 seconds.
Is it possible? If yes, how? Please reply as early as possible; this will decide whether or not to use APEX for this application.
This is a question for Application Express; I don't know why it is in the BPEL forum.
Edited by: TEJU on Oct 24, 2008 2:57 PM
Hello,
You really need to load the data every 5-10 seconds? Presumably it's read only?
I would look at using an Oracle external table instead; that way you just drop your CSV in a location the DB can read, and you build a table that essentially references the underlying CSV (so when you query the table you view the data in the CSV file).
Take a look at this link for a quick example on usage -
http://www.oracle-base.com/articles/9i/SQLNewFeatures9i.php#ExternalTables
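Following the external-table suggestion above, a minimal sketch (directory, file, and column names are assumptions; adjust to your environment):

```sql
-- The directory object must point at the folder that holds the CSV.
CREATE DIRECTORY csv_dir AS '/path/to/csv';

CREATE TABLE csv_ext (
  col1 VARCHAR2(50),
  col2 NUMBER
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY csv_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ','
    OPTIONALLY ENCLOSED BY '"'
  )
  LOCATION ('data.csv')
);
-- Each query against csv_ext re-reads data.csv, so the 5-10 second
-- refresh requirement is met without any load job at all.
```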
Hope this helps,
John.
Blog: http://jes.blogs.shellprompt.net
Work: http://www.apex-evangelists.com
Author of Pro Application Express: http://tinyurl.com/3gu7cd
REWARDS: Please remember to mark helpful or correct posts on the forum, not just for my answers but for everyone! -
Loading data from flat file to table
Hi,
I am using OWB 10g R2. I have to develop a map which loads data from a flat file into a table.
1. I have created a .csv file.
2. Import that into OWB by Module Wizard.
3. I have created one external table, which uses the .csv file.
After that, what do I have to do to successfully load the data into the table?
Regards,
Sanjoy
Hi Sanjoy
You can build a mapping to move the data from somewhere to somewhere else. More specifically:
1. create a mapping
2. add the external table to the mapping
3. add the table to the mapping as the target (it can be unbound)
4. map from the external table ingroup to the target table ingroup (all columns will be mapped)
5. if the target table was unbound, right-click on it and choose Create and Bind
This is a basic mapping for this scenario. There is a post here which does something like this, but it creates a File-to-Table mapping that uses SQL*Loader.
http://blogs.oracle.com/warehousebuilder/2009/01/expert_useful_scripts_from_simple_primitives.html
Cheers
David -
Problem in loading data from flat file to info object
Hi experts,
I am new to BW. I am trying to load data from a .csv file into an InfoObject. The CSV file contains two columns, Mat No and Mat Name, separated by a comma; likewise, Mat Name is an attribute of Mat No in the InfoObject. I am following the book step by step, but whenever I load data into the InfoObject it all lands in one column: both Mat No and Mat Name appear in the Mat No column, while the Mat Name column stays empty in the InfoObject. I have converted the Excel file to .csv. Can anyone help me sort this out?
Regards
Hi,
Your flat file should truly be in CSV form; it is not enough that the values merely have commas typed between them.
Create an Excel file, choose Save As, enter your flat-file name, and in 'Save as type' select CSV (comma delimited) from the drop-down list.
Then enter the data in this file as you would fill in an Excel sheet.
After closing the file, right-click it and open it with WordPad; the data should appear separated by commas.
Try this and then revert back in case of any doubts.
Are you loading data into BI 7.0 or 3.x?
Regards
Madhavi -
Error
[Load data from excel file [1]] Error: SSIS Error Code DTS_E_CANNOTACQUIRECONNECTIONFROMCONNECTIONMANAGER. The AcquireConnection method call to the connection manager "Excel Connection Manager" failed with error code 0xC0202009. There
may be error message
I am using BIDS in Microsoft Visual Studio 2008 and running the package to load the data from Excel.
My machine has 32-bit Excel, hence I have set the Run64BitRuntime property to False.
But the error still occurs.
I checked on Google; many have set the DelayValidation property to true at the Data Flow Task level, but even using it at both the Excel connection manager and the DFT level it doesn't work.
Mudassar
That's my connection string:
Provider=Microsoft.ACE.OLEDB.12.0;Data Source=E:\SrcData\Feeds\Utilization.xlsx;Extended Properties="Excel 12.0;HDR=NO";
Excel 2010 is installed and it's the 32-bit edition.
Are you referring to installing this component, AccessDatabaseEngine_x64.exe?
http://www.microsoft.com/en-us/download/details.aspx?id=13255
Mudassar
You can try an OLEDB provider in that case;
see
http://dataintegrity.wordpress.com/2009/10/16/xlsx/
You might need to download and install the MS Access redistributable:
http://www.microsoft.com/en-in/download/details.aspx?id=13255
Please Mark This As Answer if it helps to solve the issue Visakh ---------------------------- http://visakhm.blogspot.com/ https://www.facebook.com/VmBlogs -
" (quotes) is column values when loading data from csv to oracle table
I am loading data from a CSV file into an Oracle table.
After the data is loaded, let's suppose the value in one column is X.
When I write a query to fetch data, like
select * from table where col='X'
it gives no output.
On investigating, it was found that the value is stored in the column as "X".
This also happens when I copy-paste the value from the column into a text editor.
I want to remove these double quotes, and I also want to know why they are appearing.
Any suggestions guys ?
Thanks
Using Oracle 10g
These quotes are part of your data file. Most CSV parsers remove those quotes.
If you are using an external table, you can remove them with something like this in your access parameters:
ORGANIZATION EXTERNAL
TYPE ORACLE_LOADER
ACCESS PARAMETERS
RECORDS DELIMITED BY NEWLINE
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
But if you want to remove them from your existing data, use this:
update your_table
set col1 = trim( '"' from col1 )
Note that in some CSV files which use an enclosing character, the enclosing character itself is used as an escape character to include the enclosing character inside a field.
For example, a field might be "Some text with ""this"" enclosed".
To correct these fields you might use another update:
update your_table
set col1 = replace( col1, '""', '"' ) -
ODI error while loading data from a flat file to Oracle
Hi Gurus,
I am getting the following error while loading a flat file into an Oracle table:
java.lang.NumberFormatException: 554020
at java.math.BigInteger.parseInt(Unknown Source)
at java.math.BigInteger.<init>(Unknown Source)
at java.math.BigDecimal.<init>(Unknown Source)
at com.sunopsis.sql.SnpsQuery.updateExecStatement(SnpsQuery.java)
at com.sunopsis.sql.SnpsQuery.addBatch(SnpsQuery.java)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execCollOrders(SnpSessTaskSql.java)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTaskTrt(SnpSessTaskSql.java)
at com.sunopsis.dwg.dbobj.SnpSessTaskSqlC.treatTaskTrt(SnpSessTaskSqlC.java)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java)
The connections between source and target are fine.
Kindly suggest on this error.
Thanks
Shridhar
Hi John,
The source is a CSV file. By default all columns are string type. The integration is failing at step 3 (Load Data).
I am loading the data from the CSV file directly into the staging table (Oracle). It's a one-to-one mapping.
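A common cause of a NumberFormatException on a value that looks numeric is a stray space or other non-numeric character in the CSV field: `java.math.BigDecimal(String)` (used by the SnpsQuery bind step shown in the stack trace) does not trim its input. A small sketch (the stray-space cause is an assumption; inspect the actual field in the file):

```java
import java.math.BigDecimal;

public class NumberParseDemo {
    public static void main(String[] args) {
        System.out.println(new BigDecimal("554020"));   // a clean value parses fine
        try {
            new BigDecimal("554020 ");                  // trailing space from the file
        } catch (NumberFormatException e) {
            System.out.println("NumberFormatException, as in the ODI log");
        }
        // Trimming (or cleansing in the staging mapping) avoids the failure.
        System.out.println(new BigDecimal("554020 ".trim()));
    }
}
```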
Shridhar -
Load data from XML files into BI
Hi All,
Can anyone guide me through the steps involved in loading data from XML files into BI?
Thanks
Santosh
Hi James,
I have followed up to step 12.
Could you please let me know how to link the XML file on my local drive to BI?
I am not able to figure out where to specify the path of the XML file to be loaded into BI.
Thanks In Advance
Regards,
San
1. Create an InfoSource (ZIS) with a transfer structure matching the structure in the XML file. (You can open the XML file with MS Excel; this way you can get to know the structure.)
2. Activate the transfer rules.
3. After activation, from the menu bar select Extras > Create BW DataSource with SOAP connection.
4. Now activate the InfoSource again; this creates an XML-based DataSource (6AXXX...).
5. These steps create two sub-nodes under the InfoSource (ZIS).
6. Select the Myself system, create a data package and run it in Init mode (without data transfer).
7. This creates an entry in RSA7 (BW delta queue).
8. Create another delta InfoPackage under it and run it. Now the DataSource (6AXXXXXX..) turns green in RSA7.
9. In Function Builder (SE37), select your FM (do an F4 search on the DataSource 6AXXX...).
10. Inside this RFC-based FM, from the menu bar select Utilities > More Utilities > Create Web Services > From Function Module.
11. A wizard will guide you through the next steps.
12. Once this is done, a Web service will be enabled in WSADMIN. Select your FM and execute it.
13. From here you can upload the data from the XML file into the BW delta queue.
Edited by: Santosh on Nov 30, 2008 2:22 PM -
How to load data from a file into the database with UTL_FILE
Hi All,
I am new in this technologies.
I am facing the problem below when loading data from a file into the database using UTL_FILE.
Below is the script written by me:
CREATE OR REPLACE PROCEDURE load_data AS
v_line VARCHAR2(2000);
v_file UTL_FILE.FILE_TYPE;
v_dir VARCHAR2(250);
v_filename VARCHAR2(50);
v_1st_Comma NUMBER;
v_2nd_Comma NUMBER;
v_deptno NUMBER;
v_dname VARCHAR2(14);
v_loc VARCHAR2(13);
BEGIN
v_dir := ':f/rashi/dataload';
v_filename := 'fake.txt';
v_file := UTL_FILE.FOPEN(v_dir, v_filename, 'r');
LOOP
BEGIN
UTL_FILE.GET_LINE(v_file, v_line);
EXCEPTION
WHEN no_data_found THEN
exit;
END;
v_1st_Comma := INSTR(v_line, ',' ,1 , 1);
v_2nd_Comma := INSTR(v_line, ',' ,1 , 2);
v_deptno := SUBSTR(v_line, 1, v_1st_Comma-1);
v_dname := SUBSTR(v_line, v_1st_Comma+1, v_2nd_Comma-v_1st_Comma-1);
v_loc := SUBSTR(v_line, v_2nd_Comma+1);
DBMS_OUTPUT.PUT_LINE(v_deptno || ' - ' || v_dname || ' - ' || v_loc);
INSERT INTO don
VALUES (v_deptno, UPPER(v_dname), UPPER(v_loc));
END LOOP;
UTL_FILE.FCLOSE(v_file);
COMMIT;
END;
show error
I am getting the below errors:
LINE/COL ERROR
3/8 PL/SQL: Item ignored
3/8 PLS-00201: identifier 'UTL_FILE' must be declared
15/1 PL/SQL: Statement ignored
15/1 PLS-00320: the declaration of the type of this expression is
incomplete or malformed
20/1 PL/SQL: Statement ignored
20/19 PLS-00320: the declaration of the type of this expression is
incomplete or malformed
36/1 PL/SQL: Statement ignored
LINE/COL ERROR
36/17 PLS-00320: the declaration of the type of this expression is
incomplete or malformed
Could anyone please advise me on the above errors?
Thanks.
First of all, is the file located on (or accessible from) the database server? If not, you can't load it with either UTL_FILE or an external table; you need a client-side tool like SQL*Loader, or you have to write your own client-side load. If the file is accessible from the database server then, as sb92075 noted, an external table (assuming you are not on some ancient Oracle version) is the way to go.
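As for the compile errors themselves: PLS-00201 ("identifier 'UTL_FILE' must be declared") usually means the procedure's owner lacks an EXECUTE grant on the package. A hedged sketch of the usual fix (schema name and path are placeholders):

```sql
-- Run as a privileged user; load_user and the path are placeholders.
GRANT EXECUTE ON SYS.UTL_FILE TO load_user;
CREATE DIRECTORY dataload_dir AS '/rashi/dataload';
GRANT READ ON DIRECTORY dataload_dir TO load_user;
-- The procedure then opens the file via the directory object name:
--   v_file := UTL_FILE.FOPEN('DATALOAD_DIR', 'fake.txt', 'r');
```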
SY. -
Loading data from flat file to ODS
Hi Experts,
I have developed a report in which a flat file is the DataSource and the data target is a DSO.
I have a requirement to increment a counter value in the TVARVC table after every load completes.
Can this be done through a process chain? I believe it can, by creating a variable.
If it can be done through process chains, can any expert send me a sample example or code?
Thanks In Advance.
Sreedhar.M
Hi Sreedhar,
We had a similar scenario in which we load data from a flat file whose name (of the format x_20081010.dat) is fetched from the low value of the TVARVC table. Once the load executes successfully, a program increments the low value by one day, and the next day the file with that name is placed on the application server.
You can create an ABAP program variant and attribute for that, and achieve what you describe. This is sample code which searches for the date string; once your file has loaded successfully, you can write similar code to increment the counter value you are asking about:
search s_tvarvc-low for '20'.
if sy-subrc ne 0.
sy-msgty = 'E'.
sy-msgid = 'ZWM1'.
sy-msgno = '000'.
sy-msgv1 = 'Unable to find DATE to increment'.
raise unable_increment.
else.
w_date = s_tvarvc-low+sy-fdpos(08).
w_nextdate = w_date + 1.
replace w_date with w_nextdate into s_tvarvc-low.
modify tvarvc from s_tvarvc.
endif.
Regards
vamsi