Flat File target
Hi, I have an Oracle table as source with about 150 columns, and I have to load the data into a flat file. Is there any way to get the column names instead of manually entering them all?
Thanks
Hi!
You can run this query to get the column names, write them to the flat file, and then use the "Append" option when you later run the query that populates the file with data:
select column_name from user_col_comments where table_name = 'YOUR_TABLE_NAME'
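If you end up scripting the extract, the same trick applies anywhere: fetch the column list once from the data dictionary and write it as the header row before appending data. A minimal Python sketch (the column list is hardcoded here for illustration; in practice it would come from the dictionary query above):

```python
# Hypothetical column list; in practice, fetch it with the
# user_col_comments query shown above (or user_tab_columns).
columns = ["EMP_ID", "EMP_NAME", "HIRE_DATE"]

def header_line(cols, delimiter=","):
    """Join the column names into the flat file's header row."""
    return delimiter.join(cols)

# Write the header first; data rows are appended afterwards.
with open("target.csv", "w") as f:
    f.write(header_line(columns) + "\n")
```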
Similar Messages
-
Issue with characterset setting in OWB flat file target
An OWB mapping reads from a database source and writes to a flat file target on Unix, but junk characters appear in the target file for non-English characters. The database table contains French, Spanish, German and Arabic characters, and the NLS database parameter setting is AL32UTF8. The same setting has been applied to the OWB target, but junk values still appear for the non-English characters. Different character sets (AL32UTF8, UTF8, UTF16, US7ASCII) have been tried at the OWB target setting, but nothing removes the junk characters. Please suggest.
Edited by: 943807 on 30 Jun, 2012 10:43 PM
Please provide some input on the issue.
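For what it's worth, junk output like this is the classic signature of an encoding mismatch: the bytes are written in one character set and interpreted in another. A small Python sketch (not OWB-specific) showing how UTF-8 bytes from an AL32UTF8 database turn into junk when read back as a single-byte character set:

```python
text = "café señor"                  # non-English characters from the source table
utf8_bytes = text.encode("utf-8")    # what an AL32UTF8 database emits

# Reading those bytes back with the wrong character set produces "junk":
garbled = utf8_bytes.decode("latin-1")
print(garbled)                       # → cafÃ© seÃ±or

# Reading them with the matching character set round-trips cleanly:
assert utf8_bytes.decode("utf-8") == text
```

So the thing to verify is not just the file definition in OWB, but that every link in the chain (database, session NLS settings, the file definition, and whatever viewer you inspect the file with) agrees on the same character set.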
-
Urgent: Character set of flat file target in OWB
An OWB mapping reads from a database source and writes to a flat file target on Unix, but junk characters appear in the target file for non-English characters. The database table contains French, Spanish, German and Arabic characters, and the NLS database parameter setting is AL32UTF8. The same setting has been applied to the OWB target by editing the flat file definition, but junk values still appear for the non-English characters. Different character sets (AL32UTF8, UTF8, UTF16, US7ASCII) have been tried at the OWB target setting, but nothing removes the junk characters. Please suggest.
-
SQL Server Source and Flat File Target in OWB
Hello All,
I have a question: is it possible to have SQL Server as the source and a flat file as the target, without using any intermediate Oracle table(s)?
like
SQL Server --> ETL Operators --> Flat file?
as I'm getting data type conversion errors here, but if I replace the flat file with an Oracle table it works fine (though I have also used some data type conversion functions in the flat file loading, it still gives me errors).
Thanks in advance.
Tayyeb
Yes, this should be fine. Are you using the gateway to access SQL Server, or code template mappings? You need to track which columns are throwing the data type conversion errors.
Cheers
David -
Dynamically changing flat file target location in OWB10gr2.
Hi,
We have a requirement where we are supposed to load a flat file from relational tables, and the location of the flat file needs to be configurable at runtime.
Please suggest if this is achievable.
Regards,
PHD
Edited by: user1662077 on Jun 17, 2009 10:08 PM
See the post here for dynamically generating target file names using the OWB file operator as a target:
http://blogs.oracle.com/warehousebuilder/2007/07/dynamically_generating_target.html
Other techniques also include using a table function as a target.
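Whichever mechanism you plug into OWB, the core of dynamic target naming is just computing the file name at runtime instead of hardcoding it. A hypothetical Python sketch of such a naming scheme (the prefix and pattern are made up):

```python
from datetime import datetime

def target_file_name(prefix, when=None, ext="csv"):
    """Build a run-specific target file name, e.g. sales_20090617_220800.csv."""
    when = when or datetime.now()
    return "{}_{}.{}".format(prefix, when.strftime("%Y%m%d_%H%M%S"), ext)

print(target_file_name("sales"))
```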
Cheers
David -
I'm trying to execute an SSIS package via SQL Agent with a flat file source; however, it fails with: Code: 0xC001401E The file name "\\server\share\path\file.txt" specified in the connection was not valid.
It appears that the problem is with the rights of the user running the package (a proxy account). If I use a higher-privilege account (domain admin) to run the package, it completes successfully. But this is not a long-term solution, and I can't see a reason why the user doesn't have rights to the file. The effective permissions of the file and the parent folder both give the user full control. The user has full control over the share as well. The user can access the file (copy, etc.) outside the SSIS package.
Running the package manually via DTExec gives me the same error; I've tried the 32-bit and 64-bit versions with the same result. But running as a domain admin works correctly every time.
I feel like I've been beating my head against a brick wall on this one... Is there some sort of magic permission, file or otherwise, that is required to use a flat file target in an SSIS package?
Hi Rossco150,
I have tried to reproduce the issue in my test environment (Windows Server 2012 R2 + SQL Server 2008 R2); however, everything works with the permission settings you mentioned. In my test, the permissions of the folders are set as follows:
\\ServerName\Temp --- Read
\\ServerName\Temp\Source --- No access
\\ServerName\Temp\Source\Flat Files --- Full control
I suspect that your permission settings on the folders are not exactly as you described. Could you double-check the permission settings on each level of the folder hierarchy? In addition, check the "Execute as user" information in the job history to make sure the job was indeed running in the proxy security context. Which version of SSIS are you using? If possible, I suggest that you install the latest Service Pack for your SQL Server, or even the latest CU patch.
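Before digging further into folder ACLs, a quick runtime check (a minimal sketch, not tied to SSIS) run under the proxy account will tell you whether that account can really open the file; the UNC path below is a placeholder:

```python
def can_read(path):
    """Return (ok, detail): actually try to open the file, since
    effective-permissions dialogs can disagree with runtime access."""
    try:
        with open(path, "rb") as f:
            f.read(1)
        return True, "readable"
    except OSError as exc:
        return False, str(exc)

# Placeholder path; substitute the flat file's real UNC path.
ok, detail = can_read(r"\\server\share\path\file.txt")
print(ok, detail)
```

Running this as the proxy account (e.g. via a scheduled task) separates "the account cannot see the file" from "SSIS is resolving the connection string differently".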
Regards,
Mike Yin
TechNet Community Support -
Reverse engineering using Flat file
Hello,
Can someone help me please.
My environment,
I am new to ODI and trying to work with it.
My source DB = FLAT FILE
Target DB = MySQL 5.1
I created a new project and models. I have the physical and logical architecture, and all the connections are working. In the physical architecture, I specified the path of the csv file as c:\textfile.csv.
I created a new model and a data store, and pointed the file to the above directory and file.
When I click on reverse in column tab, it says
ERROR: The directory c:\textfile.csv specified in your schema, does not exist.
I am unable to figure out the problem. Can someone help me please.
Thank you.
Edited by: user9946607 on Mar 17, 2010 5:52 PM
Hi,
Thank you.
I was able to solve the file-loading error. Now I am using the LKM File to SQL and IKM SQL Control Append. It says the execution session started, but when I go to Operator, it fails at the Load step with this error:
7000 : null : java.sql.SQLException: Could not read heading rows from file
java.sql.SQLException: Could not read heading rows from file
at com.sunopsis.jdbc.driver.file.x.<init>(x.java)
at com.sunopsis.jdbc.driver.file.a.d.d.a(d.java)
at com.sunopsis.jdbc.driver.file.g.a(g.java)
at com.sunopsis.jdbc.driver.file.w.executeQuery(w.java)
at com.sunopsis.sql.SnpsQuery.executeQuery(SnpsQuery.java)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execCollOrders(SnpSessTaskSql.java)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTaskTrt(SnpSessTaskSql.java)
at com.sunopsis.dwg.dbobj.SnpSessTaskSqlC.treatTaskTrt(SnpSessTaskSqlC.java)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java)
at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java)
at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java)
at com.sunopsis.dwg.cmd.DwgCommandSession.treatCommand(DwgCommandSession.java)
at com.sunopsis.dwg.cmd.DwgCommandBase.execute(DwgCommandBase.java)
at com.sunopsis.dwg.cmd.e.i(e.java)
at com.sunopsis.dwg.cmd.g.y(g.java)
at com.sunopsis.dwg.cmd.e.run(e.java)
at java.lang.Thread.run(Unknown Source) -
Errors extracting from Essbase to flat file
New to ODI. I have the environment working, or so I think.
Using ODI 11.1.1.6.0 Studio; no web services, just the Studio.
Essbase 9.3.3 is source
Target is flat file on C:\ drive of PC
PC is Windows 7, 32 bit
Running an interface to extract from the Essbase outline (metadata) to a flat file (csv), the integration step (4) "Extract Metadata" fails. The first 3 steps are successful (drop work table, create work table, begin Essbase metadata extract).
The error messages in the 4th step are:
org.apache.bsf.BSFException: exception from Jython:
Traceback (most recent call last):
File "<string>", line 1, in <module>
at com.hyperion.odi.essbase.ODIStagingLoader.loadData(Unknown Source)
at com.hyperion.odi.essbase.AbstractEssbaseReader.extract(Unknown Source)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
com.hyperion.odi.essbase.ODIEssbaseException: com.hyperion.odi.essbase.ODIEssbaseException: Loading the data into staging area failed.
at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:146)
at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.execInBSFEngine(SnpScriptingInterpretor.java:322)
at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.exec(SnpScriptingInterpretor.java:170)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.scripting(SnpSessTaskSql.java:2473)
at oracle.odi.runtime.agent.execution.cmd.ScriptingExecutor.execute(ScriptingExecutor.java:48)
at oracle.odi.runtime.agent.execution.cmd.ScriptingExecutor.execute(ScriptingExecutor.java:1)
at oracle.odi.runtime.agent.execution.TaskExecutionHandler.handleTask(TaskExecutionHandler.java:50)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.processTask(SnpSessTaskSql.java:2913)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java:2625)
at com.sunopsis.dwg.dbobj.SnpSessStep.treatAttachedTasks(SnpSessStep.java:561)
at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java:464)
at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java:2093)
at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$2.doAction(StartSessRequestProcessor.java:366)
at oracle.odi.core.persistence.dwgobject.DwgObjectTemplate.execute(DwgObjectTemplate.java:216)
at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.doProcessStartSessTask(StartSessRequestProcessor.java:300)
at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.access$0(StartSessRequestProcessor.java:292)
at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$StartSessTask.doExecute(StartSessRequestProcessor.java:855)
at oracle.odi.runtime.agent.processor.task.AgentTask.execute(AgentTask.java:126)
at oracle.odi.runtime.agent.support.DefaultAgentTaskExecutor$2.run(DefaultAgentTaskExecutor.java:83)
at java.lang.Thread.run(Thread.java:662)
Caused by: Traceback (most recent call last):
File "<string>", line 1, in <module>
at com.hyperion.odi.essbase.ODIStagingLoader.loadData(Unknown Source)
at com.hyperion.odi.essbase.AbstractEssbaseReader.extract(Unknown Source)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
com.hyperion.odi.essbase.ODIEssbaseException: com.hyperion.odi.essbase.ODIEssbaseException: Loading the data into staging area failed.
at org.python.core.PyException.fillInStackTrace(PyException.java:70)
at java.lang.Throwable.<init>(Throwable.java:181)
at java.lang.Exception.<init>(Exception.java:29)
at java.lang.RuntimeException.<init>(RuntimeException.java:32)
at org.python.core.PyException.<init>(PyException.java:46)
at org.python.core.PyException.<init>(PyException.java:43)
at org.python.core.Py.JavaError(Py.java:455)
at org.python.core.Py.JavaError(Py.java:448)
at org.python.core.PyReflectedFunction.__call__(PyReflectedFunction.java:177)
at org.python.core.PyObject.__call__(PyObject.java:355)
at org.python.core.PyMethod.__call__(PyMethod.java:215)
at org.python.core.PyMethod.instancemethod___call__(PyMethod.java:221)
at org.python.core.PyMethod.__call__(PyMethod.java:206)
at org.python.core.PyObject.__call__(PyObject.java:381)
at org.python.core.PyObject.__call__(PyObject.java:385)
at org.python.pycode._pyx6.f$0(<string>:1)
at org.python.pycode._pyx6.call_function(<string>)
at org.python.core.PyTableCode.call(PyTableCode.java:165)
at org.python.core.PyCode.call(PyCode.java:18)
at org.python.core.Py.runCode(Py.java:1204)
at org.python.core.Py.exec(Py.java:1248)
at org.python.util.PythonInterpreter.exec(PythonInterpreter.java:172)
at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:144)
at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.execInBSFEngine(SnpScriptingInterpretor.java:322)
at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.exec(SnpScriptingInterpretor.java:170)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.scripting(SnpSessTaskSql.java:2472)
at oracle.odi.runtime.agent.execution.cmd.ScriptingExecutor.execute(ScriptingExecutor.java:47)
at oracle.odi.runtime.agent.execution.cmd.ScriptingExecutor.execute(ScriptingExecutor.java:1)
at oracle.odi.runtime.agent.execution.TaskExecutionHandler.handleTask(TaskExecutionHandler.java:50)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.processTask(SnpSessTaskSql.java:2913)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java:2625)
at com.sunopsis.dwg.dbobj.SnpSessStep.treatAttachedTasks(SnpSessStep.java:558)
at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java:464)
at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java:2093)
at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$2.doAction(StartSessRequestProcessor.java:366)
at oracle.odi.core.persistence.dwgobject.DwgObjectTemplate.execute(DwgObjectTemplate.java:216)
at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.doProcessStartSessTask(StartSessRequestProcessor.java:300)
at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.access$0(StartSessRequestProcessor.java:292)
at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$StartSessTask.doExecute(StartSessRequestProcessor.java:855)
at oracle.odi.runtime.agent.processor.task.AgentTask.execute(AgentTask.java:126)
at oracle.odi.runtime.agent.support.DefaultAgentTaskExecutor$2.run(DefaultAgentTaskExecutor.java:82)
... 1 more
Caused by: com.hyperion.odi.essbase.ODIEssbaseException: Loading the data into staging area failed.
at com.hyperion.odi.essbase.ODIStagingLoader.loadData(Unknown Source)
at com.hyperion.odi.essbase.AbstractEssbaseReader.extract(Unknown Source)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.python.core.PyReflectedFunction.__call__(PyReflectedFunction.java:175)
at org.python.core.PyObject.__call__(PyObject.java:355)
at org.python.core.PyMethod.__call__(PyMethod.java:215)
at org.python.core.PyMethod.instancemethod___call__(PyMethod.java:221)
at org.python.core.PyMethod.__call__(PyMethod.java:206)
at org.python.core.PyObject.__call__(PyObject.java:381)
at org.python.core.PyObject.__call__(PyObject.java:385)
at org.python.pycode._pyx6.f$0(<string>:1)
at org.python.pycode._pyx6.call_function(<string>)
at org.python.core.PyTableCode.call(PyTableCode.java:165)
at org.python.core.PyCode.call(PyCode.java:18)
at org.python.core.Py.runCode(Py.java:1204)
at org.python.core.Py.exec(Py.java:1249)
at org.python.util.PythonInterpreter.exec(PythonInterpreter.java:173)
at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:144)
at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.execInBSFEngine(SnpScriptingInterpretor.java:322)
at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.exec(SnpScriptingInterpretor.java:170)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.scripting(SnpSessTaskSql.java:2473)
at oracle.odi.runtime.agent.execution.cmd.ScriptingExecutor.execute(ScriptingExecutor.java:48)
at oracle.odi.runtime.agent.execution.cmd.ScriptingExecutor.execute(ScriptingExecutor.java:1)
at oracle.odi.runtime.agent.execution.TaskExecutionHandler.handleTask(TaskExecutionHandler.java:50)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.processTask(SnpSessTaskSql.java:2913)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java:2625)
at com.sunopsis.dwg.dbobj.SnpSessStep.treatAttachedTasks(SnpSessStep.java:561)
at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java:464)
at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java:2093)
at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$2.doAction(StartSessRequestProcessor.java:366)
at oracle.odi.core.persistence.dwgobject.DwgObjectTemplate.execute(DwgObjectTemplate.java:216)
at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.doProcessStartSessTask(StartSessRequestProcessor.java:300)
at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.access$0(StartSessRequestProcessor.java:292)
at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$StartSessTask.doExecute(StartSessRequestProcessor.java:855)
at oracle.odi.runtime.agent.processor.task.AgentTask.execute(AgentTask.java:126)
at oracle.odi.runtime.agent.support.DefaultAgentTaskExecutor$2.run(DefaultAgentTaskExecutor.java:83)
... 1 more
Caused by: java.sql.SQLException: ODI-40400: Invalid format description
at com.sunopsis.jdbc.driver.file.impl.metadata.MetaDataExtractor.buildTableDescFromRequestSQLExceptionOnly(MetaDataExtractor.java:479)
at com.sunopsis.jdbc.driver.file.SQLManager.buildCommand(SQLManager.java:437)
at com.sunopsis.jdbc.driver.file.FileStatement.buildCommand(FileStatement.java:95)
at com.sunopsis.jdbc.driver.file.FileStatement.executeQuery(FileStatement.java:62)
... 40 more
The "Code" tab of the errored session step has "stats=essbaseAppReader.extract()" in the Target Code, and nothing in the Source Code.
In the interface, "Staging area different from Target" is NOT checked (flat file target).
The log file shows the outline was read and the members retrieved; the log for this execution shows:
2012-06-14 10:36:49,873 INFO [SimpleAsyncTaskExecutor-5]: ODI Hyperion Essbase Adapter Version 9.3.1.1
2012-06-14 10:36:49,873 INFO [SimpleAsyncTaskExecutor-5]: Connecting to Essbase application [A_USRDFD] on [prodbpm1]:[1423] using username [admin].
2012-06-14 10:36:52,275 INFO [SimpleAsyncTaskExecutor-5]: Successfully connected to the Essbase application.
2012-06-14 10:36:52,275 INFO [SimpleAsyncTaskExecutor-5]: Essbase Metadata extract LKM option ALIAS_TABLE = null
2012-06-14 10:36:52,275 INFO [SimpleAsyncTaskExecutor-5]: Essbase Metadata extract LKM option MEMBER_FILTER_CRITERIA = IDESCENDANTS
2012-06-14 10:36:52,275 WARN [SimpleAsyncTaskExecutor-5]: Invalid value specified [ ] for member filter value. Using default value [TC Codes].
2012-06-14 10:36:52,275 INFO [SimpleAsyncTaskExecutor-5]: Essbase Metadata extract LKM option MEMBER_FILTER_VALUE = TC Codes
2012-06-14 10:36:52,275 INFO [SimpleAsyncTaskExecutor-5]: Essbase Metadata extract LKM option DIMENSION = TC Codes
2012-06-14 10:36:52,322 INFO [SimpleAsyncTaskExecutor-5]: Executing essbase member selection to fetch the metadata
2012-06-14 10:36:52,572 INFO [SimpleAsyncTaskExecutor-5]: Total number of metadata records fetched from essbase : [574].
2012-06-14 10:36:52,572 DEBUG [SimpleAsyncTaskExecutor-5]: Constructing iterator for the extracted data
2012-06-14 10:36:52,572 INFO [SimpleAsyncTaskExecutor-5]: Loading the staging area with the extracted records
2012-06-14 10:36:52,572 INFO [SimpleAsyncTaskExecutor-5]: Logging out and disconnecting from the essbase application.
2012-06-14 10:49:45,408 INFO [SimpleAsyncTaskExecutor-6]: ODI Hyperion Essbase Adapter Version 9.3.1.1
2012-06-14 10:49:45,408 INFO [SimpleAsyncTaskExecutor-6]: Connecting to Essbase application [A_USRDFD] on [prodbpm1]:[1423] using username [admin].
2012-06-14 10:49:47,748 INFO [SimpleAsyncTaskExecutor-6]: Successfully connected to the Essbase application.
2012-06-14 10:49:47,748 INFO [SimpleAsyncTaskExecutor-6]: Essbase Metadata extract LKM option ALIAS_TABLE = null
2012-06-14 10:49:47,748 INFO [SimpleAsyncTaskExecutor-6]: Essbase Metadata extract LKM option MEMBER_FILTER_CRITERIA = IDESCENDANTS
2012-06-14 10:49:47,748 WARN [SimpleAsyncTaskExecutor-6]: Invalid value specified [ ] for member filter value. Using default value [TC Codes].
2012-06-14 10:49:47,748 INFO [SimpleAsyncTaskExecutor-6]: Essbase Metadata extract LKM option MEMBER_FILTER_VALUE = TC Codes
2012-06-14 10:49:47,748 INFO [SimpleAsyncTaskExecutor-6]: Essbase Metadata extract LKM option DIMENSION = TC Codes
2012-06-14 10:49:47,795 INFO [SimpleAsyncTaskExecutor-6]: Executing essbase member selection to fetch the metadata
2012-06-14 10:49:47,951 INFO [SimpleAsyncTaskExecutor-6]: Total number of metadata records fetched from essbase : [574].
2012-06-14 10:49:47,951 DEBUG [SimpleAsyncTaskExecutor-6]: Constructing iterator for the extracted data
2012-06-14 10:49:47,966 INFO [SimpleAsyncTaskExecutor-6]: Loading the staging area with the extracted records
2012-06-14 10:49:47,966 INFO [SimpleAsyncTaskExecutor-6]: Logging out and disconnecting from the essbase application.
Only the file being written to in the Target exists; it is updated with the column headings:
C1_SORTID,C2_PARENTNAME,C3_MEMBERNAME,C4_ALIAS,C5_DATASTORAGE,C6_TWOPASSCALC,C7_CONSOLIDATION,C8_UDA,C9_FORMULA,C10_COMMENT
There are no other files in the Target flat file folder.
I did have to add a primary key to the target datastore (added as PK_Sortid on the "SortID" column) in order to get the flow to run. The LKM used is "LKM Hyperion Essbase METADATA to SQL.GLOBAL", and the CKM is "CKM SQL.GLOBAL" (the only one showing in the drop-down).
I don't know what else to check or change.
thanks
I think I saw in your post that "Staging area different from Target" is NOT checked; that is probably why you are getting lots of "loading to staging area" errors. Your staging area has to be able to support relational operations, i.e. a relational database such as SQL Server or Oracle, or the SUNOPSIS Memory Engine. In your interface, make sure you check "Staging area different from Target" and, for example, select the SUNOPSIS Memory Engine as your staging area. Resave the interface and re-run.
-
Load data from flat file to target table with scheduling in loader
Hi All,
I have requirement as follows.
I need to load data into the target table every Saturday. My source file consists of data for several states. Every week I have to load one particular state's data into the target table.
If in the first week I loaded AP data, then in the second week, on Saturday, Karnataka, etc.
Can you please also provide code for how I can schedule the data load every Saturday with different state column values automatically.
Thanks & Regards,
Sekhar
The best solution would be:
get the flat file to the Database server
define an External Table pointing to that flat file
insert into destination table(s) as select * from external_table
Loading only a single state's data each Saturday might mean trouble, but assuming there are valid reasons to do so, you could:
create a job with a p_state parameter executing insert into destination table(s) as select * from external_table where state = p_state
create a Scheduler chain where each member runs on the next Saturday, executing the same job with a different p_state parameter
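The per-Saturday rotation described above boils down to "derive the state from the run date". A hypothetical Python sketch (the state list and start date are assumptions) of how the job's p_state value could be computed:

```python
from datetime import date

# Hypothetical rotation: one state per Saturday, cycling through the list.
STATES = ["AP", "KARNATAKA", "TAMILNADU", "KERALA"]
FIRST_SATURDAY = date(2020, 1, 4)   # assumed first Saturday of the rotation

def state_for(run_date):
    """Return the state whose data should be loaded on this Saturday."""
    weeks = (run_date - FIRST_SATURDAY).days // 7
    return STATES[weeks % len(STATES)]

# The scheduled job would then effectively run:
#   insert into destination select * from external_table where state = :p_state
print(state_for(date(2020, 1, 11)))   # → KARNATAKA (second Saturday)
```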
Managing Tables
Oracle Scheduler Concepts
Regards
Etbin -
Different formats of the flat file for the same target
In our deployment, we use plugin code to extract the csv files in the required format. The customers are on the same version of the datamart, but on different versions of the source database (from 3.x to 4.5), depending on which version of the application they are using. In 4.0 we introduced a new email column in the user table of the source database, and accordingly the plugin adds the field to the csv file. But not all customers will get the upgraded plugin at the same time, so the ETL code needs to decide which data flow to process, depending on the format of the csv file, in order to load data into the same target table. I made the email field in the target table nullable, but it still expects the same csv format, with a delimiter for the null value.
Need help to achieve this. Can I read the structure of the flat file in DS, or get the count of delimiters, so that I can use a conditional to choose a different data flow based on the format of the flat files?
Can I make the email column in the flat file optional?
Thanks much in advance.
You can add an email column that maps to null in a query transform for the source that does not contain this column.
Or else you can define two different file formats that map to the same file: one with the column and one without. -
Flat File as Target - MultiRecord Type
How do I write from Oracle into a flat file target where the data in the flat file is a multi-record type? For example, Sales Order header and Sales Order line info.
I tried to do this by using a Joiner on the sales order header, line and item tables, and then mapping the outgroup of the Joiner to the flat file data with two record types in it. But when I generate and run this, it can only create header records in the output file.
Dear 396217,
As far as I know, you cannot really push out to a multi-record structure without manual intervention.
There are a few things you can do: on the target (not sure whether that was already available in 9.0.3, but it definitely is in 9.0.4) you can set the flat file target to generate field names as headers. If that is what you want, then great.
The initial scenario, orders and order items in one model, is not supported. What you could do is write a transformation to do this: use calls to UTL_FILE to implement it and generate your own records. Primitive, but it works.
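The UTL_FILE approach amounts to emitting a record-type indicator yourself and interleaving header and line records. A minimal Python sketch with made-up order data, just to show the intended file shape:

```python
# Hypothetical orders: header fields plus a list of line items.
orders = [
    {"order_no": 1001, "customer": "ACME",
     "lines": [("WIDGET", 10), ("BOLT", 250)]},
]

def write_multirecord(orders, path):
    """Write 'H' (header) and 'L' (line) records into one flat file."""
    with open(path, "w") as f:
        for o in orders:
            f.write("H,{},{}\n".format(o["order_no"], o["customer"]))
            for item, qty in o["lines"]:
                f.write("L,{},{},{}\n".format(o["order_no"], item, qty))

write_multirecord(orders, "orders.dat")
print(open("orders.dat").read())
```

A UTL_FILE implementation would do the same thing with UTL_FILE.PUT_LINE inside a cursor loop over the joined header/line data.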
Alternatively, create 2 files and have a separate external process merge these. Not ideal; I realize that.
Mark. -
Multiple flat file in and multiple target tables
Hi,
How can we load multiple flat files into multiple targets?
I am trying to load data from multiple flat files into their respective tables, but it gives an error like:
VLD-2411: Cannot handle two file structures
Make sure that only one file structure is used in a SQL*Loader map
Can anyone help.
Regards
Rakesh Kumar
I do not think you can use multiple SQL*Loader files in one mapping.
If you want to load data from multiple files, use an external table.
URGENT: How to map an aggregated result to a target flat file?
All,
I am trying to map the result of the following query from 'source1' to my target, which is a flat file.
The query result set has one row per account, which is unique; a record looks like:
12300,9675,2008,6,11250,$-2095.48
Questions>>>
1. In what context (at what level: project, global?) should I create the $v_year and $v_period variables?
I need the values of these 2 variables to be recognized at the end, once I 'package' the mapping, so the user can pass in the values at runtime.
2. Can this query somehow be plugged in as a 'filter' on top of 'source1' in the interface, and the result then mapped to the target?
What is the alternative if not? (Dump it to another temp table in the SUNOPSIS Memory Engine?)
SELECT
  PS_.product,
  PS_.operating_unit,
  $V_YEAR,
  $V_PERIOD,
  PS_.account,
  SUM(PS_.posted_total_amt) TOT_AMT
FROM psoftdb.ps_ledger PS_
WHERE PS_.ledger = 'ACTUALS'
  AND PS_.accounting_period <= $V_PERIOD
  AND PS_.fiscal_year = $V_YEAR
  AND (SUBSTR(PS_.account,1,1) = '1' OR
       SUBSTR(PS_.account,1,1) = '2' OR
       SUBSTR(PS_.account,1,1) = '3')
GROUP BY
  PS_.product,
  PS_.operating_unit,
  PS_.account
In order to do this you will need to declare two variables at project level, with non-persistent type. You will need to have reverse-engineered the source table into a data model. You should also define your flat file as a data model; if this does not already exist, you will need to define it manually.
Create a package in which the first steps are to declare the variables.
Then insert an interface which uses your source table as a source and your flat file as a target.
In the interface select "Staging area different from target" and select the schema of your source from the drop-down.
In your source, drag all the columns on which you want to filter (including accounting_period and fiscal_year) onto the grey background and fill in the filter details. Use the expression editor to set the filter, selecting the variable from the list, which ensures the correct syntax is used.
Fill in all the other mapping details.
On the flow tab select the IKM SQL to File Append as your IKM. This should allow you to create your output in one step.
Build a scenario from your package, and when you execute it pass in the variables:
e.g. startscen MYSCEN MYCONTEXT 001 "-MYPROJ.V_YEAR=1999" "-MYPROJ.V_PERIOD=07" -
Flat file as source and target in owb-error
hi,
I want to run a mapping with a flat file as source and an RDBMS table as target.
The mapping was deployed with warnings.
And when i run the mapping i get the below error.
ORA-04063: package body "Schema.project_name" has errors
ORA-06508: PL/SQL: could not find program unit being called: "schema.project_name"
ORA-06512: at line 1
Hi
There's nothing wrong with your theory; however, you might get a DB lock depending on the locking strategy of your RDBMS and its current configuration.
Now, assuming that DB locking is your issue (get a DBA to check when the dataflow stalls), there are 2 solutions:
1. change the DB lock settings
2. take a copy of the target table in a new dataflow and use that as the source in the original dataflow
Michael -
Flat file as targets in a mapping.
Hi All,
While reading the online documentation on the topic "Define Mappings" of Oracle9i Warehouse Builder, I came across the following statement:
"A mapping can contain up to 50 flat files as targets, and it can also contain a mix of flat files, relational objects and transformations."
Does this mean a flat file can be used as a target in a mapping? I was under the impression that a flat file can only be used as a source in a mapping.
Thanks in Advance.
Regards,
Vidyanand
Hi Igor,
Some update on this: I managed to find steps to include a flat file as a target in a mapping.
Steps followed were as follows.
1> Created a File Module.
2> Imported a Text File with headings and no records.
3> Created a Warehouse Module.
4> Imported scott.dept table into the module.
5> Created a mapping, mapped scott.dept table to the text file.
6> Selected the Configure option in the mapping and set the parameters (the file name = the above file name, and the Access Specification).
7> Validated & Generated map, it looks fine with no errors.
8> Deployed & ran the package. Deployment didn't show any errors; however, running it displays error ORA-01403: no data found.
Please note that the utl_file_dir parameter is set on the Oracle server, and the utl_file_dir is mapped in Windows NT Explorer from the machine.
Does anyone know more about this error?
Thanks in Advance.
Regards,
Vidyanand