Column types on a flat file
I am using a flat file as a target, and when I define column types as anything other than string I get the following error on execution:
000 : null : com.sunopsis.jdbc.driver.file.a.i
com.sunopsis.jdbc.driver.file.a.i
at com.sunopsis.jdbc.driver.file.a.d.setDate(d.java)
at com.sunopsis.sql.SnpsQuery.updateExecStatement(SnpsQuery.java)
at com.sunopsis.sql.SnpsQuery.addBatch(SnpsQuery.java)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execCollOrders(SnpSessTaskSql.java)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTaskTrt(SnpSessTaskSql.java)
at com.sunopsis.dwg.dbobj.SnpSessTaskSqlI.treatTaskTrt(SnpSessTaskSqlI.java)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java)
at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java)
hi lpantuso,
Suppose you want to transfer file data like
xxxx 1111 12-01-2005
yyyy 2222 12-03-2007
zzzz 3333 01-05-2007
to another blank file. You have to do the following steps.
Create a file data server: if you are developing the application on your desktop, create a single file data server in Topology Manager with the proper JDBC driver and URL:
JDBC driver: com.sunopsis.jdbc.driver.file.FileDriver (choose it from the drop-down menu)
JDBC URL: jdbc:snps:dbfile
Now for each different folder create a physical schema, with both the schema and the work schema set to the location of that folder.
Click on the Context tab and add the proper logical schema name.
Come to the Designer window.
Create an individual model for each physical schema (for each folder, you could say).
Use the proper logical schema name (once you select File as the technology, all the logical schemas related to that technology will be displayed; just select yours).
Then create an individual datastore for each file (you can create at most two datastores per file).
Assuming you are using a delimited file:
While reverse-engineering the existing source file you will get all the columns of the file.
Change the data type and size of the fields as needed.
For the target file datastore, the file should exist with a blank structure, so add the columns with the proper data types; for the second field use Numeric and for the date use String as the datatype. The size must be appropriate for the target file, otherwise you'll get a warning at design time.
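As an illustration (the field names and sizes here are only an assumption matching the sample rows above), the target datastore columns could be defined as:
FIELD1 String 10
FIELD2 Numeric 10
FIELD3 String 10 (the date value written out as a string)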
Create an interface.
Use the Global context with the "Sunopsis Memory Engine" as a staging area different from the target.
Then map the source and target file.
On the Flow tab choose:
LKM File to SQL for the source area
No LKM is needed for the staging area
IKM SQL to File Append for the target (if these are not set automatically)
Then try to execute; it works.
Note: if you want to transfer table data to a file, then for date-type data you have to check the following things:
1) At the time of mapping the date column from the table to the file interface, use
to_char(<SourceTableNameAlias>.<columnName>) instead of
<SourceTableNameAlias>.<columnName>
and the date field of the target file should be of String type.
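For example (the alias, column name, and format mask below are hypothetical), the target mapping expression could look like:
TO_CHAR(SRC_ORDERS.ORDER_DATE, 'MM-DD-YYYY')
An explicit format mask keeps the layout of the date string in the target file predictable.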
Similar Messages
-
How to split column-wise into separate flat files in SSIS
IN SSIS...
1. I have a sales table with country-wise region columns (india, usa, srilanka):
india usa srilanka
a     b   c
d     e   f
So I want output like this:
flat file1.txt has india, flat file2.txt has usa, flat file3.txt has srilanka:
file1.txt: a, d
file2.txt: b, e
file3.txt: c, f
2. I don't know how many regions are in my table, so the data should be split dynamically into separate flat files.
Please help me. Thank you.
I think what you can do is this:
1. Do a query based on UNPIVOT to get the data as rows instead of columns
For that you can use a query like this
IF OBJECT_ID('temp') IS NOT NULL DROP TABLE temp
CREATE TABLE temp
(
Country varchar(100),
Val decimal(25,5)
)
DECLARE @CountryList varchar(3000), @SQL varchar(max)
-- build the list of country columns from the sales table's metadata
SELECT @CountryList = STUFF((SELECT ',[' + Column_Name + ']' FROM INFORMATION_SCHEMA.COLUMNS WHERE TABLE_NAME = '<SalesTableNameHere>' FOR XML PATH('')),1,1,'')
SET @SQL = 'SELECT Country, Val FROM <SalesTableNameHere> t UNPIVOT (Val FOR Country IN (' + @CountryList + ')) p'
INSERT temp
EXEC (@SQL)
Once this is done you'll get the data unpivoted into the temp table.
Then you can use an Execute SQL Task with a query like this:
SELECT DISTINCT Country FROM Temp
Use the Full result set option and store the result in an object variable.
Then add a ForEach Loop container with the ADO enumerator and map it to the object variable created above. Have a variable inside the loop to receive the individual country values.
Inside the loop place a Data Flow Task. Use a variable to store the source query, set EvaluateAsExpression to true for it, and set its expression as below:
"SELECT Val FROM Temp WHERE Country = '" + @[User::LoopVariable] + "'"
where LoopVariable is the variable created for the loop's iterated values.
Inside the data flow task place an OLE DB source, choose the "SQL command from variable" option, and map it to the query variable above.
Link this to a Flat File Destination and create a flat file connection manager. Make the flat file connection dynamic using the expression builder, basing it on a variable that changes with each loop iteration.
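As a minimal sketch (the output folder and variable name are assumptions), the flat file connection manager's ConnectionString expression could be:
"C:\\Output\\" + @[User::LoopVariable] + ".txt"
so that each loop iteration writes its country's values to its own file.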
The core logic looks similar to this
http://visakhm.blogspot.ae/2013/09/exporting-sqlserver-data-to-multiple.html
dynamic file naming can be seen here
http://jahaines.blogspot.ae/2009/07/ssis-dynamically-naming-destination.html
Visakh
-
Reading a semicolon-delimited multi-line flat file from the KM repository
Hi,
I have a requirement in our project to read a multi-line, semicolon-delimited flat file kept on the Knowledge Management repository and display its contents in the Portal.
I have tried a couple of options and was unable to read the file. I am not sure which are the correct APIs I should be using.
If any of the experts could guide me with some sample code, that would be great!
Your early response is highly appreciated.
Regards,
Venkat
Here you go.
//******* Read file from KM
// (assumes the usual Web Dynpro / KM imports, e.g. java.io.* and com.sapportals.wcm.repository.*)
String repository_km = "/documents/data/data.txt";
String[] strText = new String[1000]; // holds the lines read from the KM file
try {
    // Getting the user
    IWDClientUser wdClientUser = WDClientUser.getCurrentUser();
    IUser sapUser = wdClientUser.getSAPUser();
    com.sapportals.portal.security.usermanagement.IUser ep5User =
        WPUMFactory.getUserFactory().getEP5User(sapUser);
    // Getting the resource context and factory
    IResourceContext resourseContext = new ResourceContext(ep5User);
    IResourceFactory resourseFactory = ResourceFactory.getInstance();
    // Path to the KM resource ("/documents/data/data.txt")
    RID rid = RID.getRID(repository_km);
    com.sapportals.wcm.repository.IResource resource =
        resourseFactory.getResource(rid, resourseContext);
    if (resource != null) {
        String text = "";
        BufferedReader in = new BufferedReader(
            new InputStreamReader(resource.getContent().getInputStream()));
        int count = 0;
        while ((text = in.readLine()) != null) {
            strText[count] = text; // one array element per line of the file
            count++;
        }
        in.close();
    }
} catch (Exception e2) { // getContent()/readLine() throw checked exceptions, so catch Exception rather than only RuntimeException
    wdComponentAPI.getMessageManager().reportException("Error in reading file from KM : " + e2.getMessage(), true);
}
-
N IDocs of the same type to a flat file
Hi Experts,
I have a scenario in which I have to receive an IDoc and retrieve the value of a particular field. Giving the value of this field as an input parameter to an RFC call, I get the number of IDocs sent with this value (of that particular field).
Then I have to append all these IDocs of the same type into a single file (within a particular timestamp, say 10 minutes). If all the IDocs do not arrive within that timestamp, a clarification case is to be raised.
Then this file is to be wrapped in a MIME structure using HTTPS and sent to the receiver. I need help in this regard; I worked out part of the solution with Steffan's blog.
-
Flat File Import, Ignore Missing Columns?
The text files I'm importing always contain either the full fixed set of columns (for example, 60 columns in the full set) or a subset of those columns (some csv files contain 40 columns, some contain 30 or 20 or any other number). I would like to import
these csv files based on the column headers inside each csv; if a file contains only a subset of the full column set, the missing columns can be ignored and loaded as null.
At the moment in SQL 2012, if I import a csv file with a subset of the columns, the data doesn't import... I assume because the actual file doesn't include every column defined in the flat file source object?
Is it possible to accomplish this without dynamically selecting the columns or using a script component?
Thanks for the help.
Sea Cloud
If the columns coming in are dynamic, then you might have to first load the file into a staging table with a single column, reading the entire row contents into that single column, and then parse out the column information using string parsing logic as below:
http://visakhm.blogspot.in/2010/02/parsing-delimited-string.html
This will help you understand which columns are present, based on which you can do the insertion into your table.
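A minimal T-SQL sketch of that staging approach (the table and column names are hypothetical, and a comma delimiter and a header row are assumptions):
-- Staging table that receives each raw line of the csv as a single column
CREATE TABLE StagingRaw (RawLine varchar(max));
-- Walk the header row with CHARINDEX to list the columns actually present in this file
DECLARE @header varchar(max), @pos int;
SELECT TOP (1) @header = RawLine FROM StagingRaw; -- assumes the header line can be identified as the first row
DECLARE @cols TABLE (ColumnName varchar(128));
WHILE LEN(@header) > 0
BEGIN
    SET @pos = CHARINDEX(',', @header);
    IF @pos = 0
    BEGIN
        INSERT @cols VALUES (@header);
        SET @header = '';
    END
    ELSE
    BEGIN
        INSERT @cols VALUES (LEFT(@header, @pos - 1));
        SET @header = STUFF(@header, 1, @pos, '');
    END
END
SELECT ColumnName FROM @cols; -- the columns this particular csv actually carries
Once you know which columns are present, you can build an INSERT into the real table for just those columns and leave the rest null.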
Visakh
-
Import flat file to multiple tables based on identifier column
Hello,
I am trying to setup a package that will import one pipe-delimited flat file (a utility bill) to multiple data tables based on the value of the first column. I have been told it is similar in format to an EDI file, but there are some differences.
The number of columns is consistent where the first columns are the same. Meaning a record that has '00' in the first column will always have 10 columns; a record that has '01' in the first column will always have 9 columns; etc.
Each value in the first column represents a separate destination data table. Meaning a record that has '00' in the first column should be output to table '00'; a record that has '01' in the first column should be output to table '01'; etc. All
destination tables reside on the same SQL Server.
Identifier columns can repeat multiple times throughout the flat file. Meaning a record that starts with '01' may be repeated multiple times in the same file.
Sample Data:
00|XXXXXXXX|XXX|XXXXXXXX|XXXXXX|XXXX|X|XXXXXXXXXX|XX|XXXXX
01|XXXXXXXXXXX|XXX|XXXXXXXX|XXXXX|XXXXXXXXXXXXXXXXXXXX|XXXXXXXXXX|XXXXXXX|XXXXXXXXXXXXXX
02|XXXXXXXXXXX|XXXXXXXX|XXXXXXX|XXXXX|XXXXX|XXXXX|XXXXX|XXXXX|XXXXX|XXXXX|XXXXX
04|XXXXXXXXXXX|XXXXXXXXXXXXX|XXX|XXXXXXXX
05|XXXXXXXXXXX|XXXXXXXXXXXXX|XXX|XXXXXXXX|XXXX
07|XXXXXXXXXXXXX|X|XXXXXXXXXXXXXXX|XXX|XXXXXXXX|XXXX|XXXXXXX|XXXXXXXXXXX
07|XXXXXXXXXXXXX|X|XXXXXXXXXXXXXXX|XXX|XXXXXXXX|XXXX|XXXXXXX|XXXXXXXXXXX
07|XXXXXXXXXXXXX|X|XXXXXXXXXXXXXXX|XXX|XXXXXXXX|XXXX|XXXXXXX|XXXXXXXXXXX
07|XXXXXXXXXXXXX|X|XXXXXXXXXXXXXXX|XXX|XXXXXXXX|XXXX|XXXXXXX|XXXXXXXXXXX
01|XXXXXXXXXXX|XXX|XXXXXXXX|XXXXX|XXXXXXXXXXXXXXXXXXXX|XXXXXXXXXX|XXXXXXX|XXXXXXXXXXXXXX
02|XXXXXXXXXXX|XXXXXXXX|XXXXXXX|XXXXX|XXXXX|XXXXX|XXXXX|XXXXX|XXXXX|XXXXX|XXXXX
04|XXXXXXXXXXX|XXXXXXXXXXXXX|XXX|XXXXXXXX
Any help would be appreciated.
Hi koldar.308,
If there are only a few distinct values in the first column, we can use a Flat File Source connected to that flat file, then use a Conditional Split Transformation to split on the first column, and then load the data into the multiple tables with OLE DB Destinations
based on the outputs of the Conditional Split.
After testing the issue in my environment, please refer to the following steps to achieve this requirement:
Drag a Flat File Source and connect it to that flat file with a Flat File Connection Manager.
Drag a Conditional Split Transformation and connect it to the Flat File Source.
Double-click the Conditional Split Transformation and add several outputs based on the first column values, as sketched below.
Drag the same number of OLE DB Destinations as there are Conditional Split outputs, and connect each one to the Conditional Split using one case output.
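A hedged sketch of those split conditions (the exact column name depends on how the flat file connection manager names the first column):
Output Table00: [Column 0] == "00"
Output Table01: [Column 0] == "01"
Output Table02: [Column 0] == "02"
One output per identifier value, each wired to the OLE DB Destination for the matching table.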
If there are any other questions, please feel free to ask.
Thanks,
Katherine Xiong
TechNet Community Support -
Copying TEXT column from flat file into SQL results in empty fields....
I'm copying a TEXT column from SQL to a flat file (ragged right, fixed width) as DT_TEXT. It copies fine and I've checked the output file. Now, while trying to copy the flat file into SQL Server on another system, I find all the fields are empty.
When I preview the source from the flat file, I see all the entries there. The other fields are copied, but the field with DT_TEXT is empty.
This is what I see when I preview the flat file source and the SQL table output.
Any help will be helpful!
Hi, I'm not sure if I'm understanding what you're saying.
The data got copied from SQL to the flat file. I've double-checked the flat file and I see the DT_TEXT data there. The size of the file also indicates that the TEXT data is there.
But when I copy that data back to SQL again (which I can also preview before the load), the DT_TEXT values go missing. The same happens when I copy to Excel, to CSV, or to a flat file; I don't see the text data.
The TEXT data resides on the first output, but when I try to extract it to another format from that output, it doesn't come out. -
Unable to upload delimited flat file (Datastore)
Dear All,
Drilling down the tutorial: ODI11g: Creating an ODI Project and Interface: Exporting a Flat File to a Flat File
I am facing quite a problem while reverse-engineering a delimited flat file (Datastore -> Model). No matter which type of delimited flat file is uploaded (txt, csv, prn), nothing happens when the "Reverse Engineer" button is pressed. Changing to the Fixed Flat File option reveals that the system uploads correctly only the first row, the column header (second option from the top), and of course all the remaining rows end up in the first column.
I was preparing the delimited flat files with Excel 2007 and 2010; in both cases, no success. I also tried to create a fixed flat file with Excel and got completely confused, since I could not work out how to do that.
Properties are below. Please, help!
About
Oracle Data Integrator 11g 11.1.1
Standalone Edition Version 11.1.1
Build ODI_11.1.1.6.0_GENERIC_111219.1055
Copyright (c) 1997, 2011 Oracle. All Rights Reserved.
IDE Version: 11.1.1.6.38.61.92
Product ID: oracle.odi
Product Version: 11.1.1.6.0.0.0
Version
Component Version
========= =======
Java(TM) Platform 1.6.0_21
Oracle IDE 11.1.1.6.0
Properties
Name Value
==== =====
awt.toolkit sun.awt.windows.WToolkit
class.load.environment oracle.ide.boot.IdeClassLoadEnvironment
class.load.log.level CONFIG
class.transfer delegate
eclipselink.xml.platform org.eclipse.persistence.platform.xml.jaxp.JAXPPlatform
file.encoding Cp1251
file.encoding.pkg sun.io
file.separator \
ice.browser.forcegc false
ice.pilots.html4.ignoreNonGenericFonts true
ice.pilots.html4.tileOptThreshold 0
ide.AssertCheckingDisabled true
ide.AssertTracingDisabled true
ide.bootstrap.start 14794412430839
ide.build ODI_11.1.1.6.0_GENERIC_111219.1055
ide.conf C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi\client\odi\bin\odi.conf
ide.config_pathname C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi\client\odi\bin\odi.conf
ide.debugbuild false
ide.devbuild false
ide.extension.search.path jdev/extensions
ide.firstrun true
ide.java.minversion 1.6.0_04
ide.launcherProcessId 2932
ide.main.class oracle.ide.boot.IdeLauncher
ide.patches.dir ide/lib/patches
ide.pref.dir C:\Users\Администратор\AppData\Roaming\odi
ide.pref.dir.base C:\Users\Администратор\AppData\Roaming
ide.product oracle.odi
ide.shell.enableFileTypeAssociation C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi\client\odi64.exe
ide.splash.screen splash.gif
ide.startingArg0 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi\client\odi64.exe
ide.startingcwd C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi\client
ide.user.dir C:\Users\Администратор\AppData\Roaming\odi
ide.user.dir.var IDE_USER_DIR
ide.work.dir C:\Users\Администратор\Documents\odi
ide.work.dir.base C:\Users\Администратор\Documents
ilog.propagatesPropertyEditors false
java.awt.graphicsenv sun.awt.Win32GraphicsEnvironment
java.awt.printerjob sun.awt.windows.WPrinterJob
java.class.path ..\..\ide\lib\ide-boot.jar;..\..\..\..\oracledi.sdk\lib\ojdl.jar;..\..\..\..\oracledi.sdk\lib\dms.jar;..\..\jdev\extensions\oracle.odi.navigator\lib\log4j-1.2.8.jar;..\..\jdev\extensions\oracle.odi.navigator\lib\odi_hfm.jar;..\..\jdev\extensions\oracle.odi.navigator\lib\odihapp_common.jar;..\..\jdev\extensions\oracle.odi.navigator\lib\ess_es_server.jar;..\..\jdev\extensions\oracle.odi.navigator\lib\ess_japi.jar;..\..\jdev\extensions\oracle.odi.navigator\lib\odihapp_essbase.jar;..\..\jdev\extensions\oracle.odi.navigator\lib\odihapp_planning.jar
java.class.version 50.0
java.endorsed.dirs C:\Java\jre\lib\endorsed
java.ext.dirs C:\Java\jre\lib\ext;C:\Windows\Sun\Java\lib\ext
java.home C:\Java\jre
java.io.tmpdir c:\Temp\
java.library.path C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi\client;.;C:\Windows\Sun\Java\bin;C:\Windows\system32;C:\Windows;C:\app\master\product\11.2.0\dbhome_1\bin;C:\Windows\system32;C:\Windows;C:\Windows\System32\Wbem;C:\Windows\System32\WindowsPowerShell\v1.0\
java.naming.factory.initial oracle.javatools.jndi.LocalInitialContextFactory
java.protocol.handler.pkgs null|oracle.odi.ui.url
java.runtime.name Java(TM) SE Runtime Environment
java.runtime.version 1.6.0_21-b07
java.specification.name Java Platform API Specification
java.specification.vendor Sun Microsystems Inc.
java.specification.version 1.6
java.util.logging.config.class oracle.core.ojdl.logging.LoggingConfiguration
java.vendor Sun Microsystems Inc.
java.vendor.url http://java.sun.com/
java.vendor.url.bug http://java.sun.com/cgi-bin/bugreport.cgi
java.version 1.6.0_21
java.vm.info mixed mode
java.vm.name Java HotSpot(TM) 64-Bit Server VM
java.vm.specification.name Java Virtual Machine Specification
java.vm.specification.vendor Sun Microsystems Inc.
java.vm.specification.version 1.0
java.vm.vendor Sun Microsystems Inc.
java.vm.version 17.0-b17
line.separator \r\n
LOG_FILE studio.log
native.canonicalization false
ODI_ORACLE_HOME C:\oracle\product\11.1.1\Oracle_ODI_1\
oracle.core.ojdl.logging.config.file ODI-logging-config.xml
oracle.home C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi\client
oracle.odi.studio.ess false
oracle.odi.wls.template.generator.agentEarLocation C:\oracle\product\11.1.1\Oracle_ODI_1\setup\manual\oracledi-agent\oraclediagent.ear
oracle.security.jps.config ./jps-config.xml
oracle.translated.locales de,es,fr,it,ja,ko,pt_BR,zh_CN,zh_TW
oracle.xdkjava.compatibility.version 9.0.4
org.apache.commons.logging.Log org.apache.commons.logging.impl.Jdk14Logger
os.arch amd64
os.name Windows Server 2008 R2
os.version 6.1
path.separator ;
python.cachedir c:\Temp\cachedir
python.home C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi\client\jdev\extensions\oracle.odi.navigator\scripting
python.packages.paths java.class.path,sun.boot.class.path,odi.class.path
reserved_filenames con,aux,prn,lpt1,lpt2,lpt3,lpt4,lpt5,lpt6,lpt7,lpt8,lpt9,com1,com2,com3,com4,com5,com6,com7,com8,com9,conin$,conout,conout$
sun.arch.data.model 64
sun.boot.class.path C:\Java\jre\lib\resources.jar;C:\Java\jre\lib\rt.jar;C:\Java\jre\lib\sunrsasign.jar;C:\Java\jre\lib\jsse.jar;C:\Java\jre\lib\jce.jar;C:\Java\jre\lib\charsets.jar;C:\Java\jre\classes;C:\Java\lib\tools.jar;C:\Java\lib\dt.jar
sun.boot.library.path C:\Java\jre\bin
sun.cpu.endian little
sun.cpu.isalist amd64
sun.desktop windows
sun.io.unicode.encoding UnicodeLittle
sun.java2d.noddraw true
sun.jnu.encoding Cp1251
sun.management.compiler HotSpot 64-Bit Server Compiler
sun.os.patch.level Service Pack 1
user.country RU
user.dir C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi\client\odi\bin
user.home C:\Users\Администратор
user.language ru
user.name master
user.timezone Europe/Moscow
user.variant
windows.shell.font.languages en
External Components
Name Version Path
==== ======= ====
HspJS.jar 11.1.1.1.0 Build 137 Built with CIS Drop 27 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi.sdk\lib\HspJS.jar
HspJS.jar 11.1.1.1.0 Build 137 Built with CIS Drop 27 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi\client\jdev\extensions\oracle.odi.navigator\lib\HspJS.jar
HspJS_11.1.2.0.jar 11.1.2.0.01.10 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi.sdk\lib\HspJS_11.1.2.0.jar
HspJS_11.1.2.0.jar 11.1.2.0.01.10 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi\client\jdev\extensions\oracle.odi.navigator\lib\HspJS_11.1.2.0.jar
activation.jar 1.1 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi.sdk\lib\activation.jar
binding-2.0.2.jar 2.0.2 2008-01-18 10:01:08 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi\client\jdev\extensions\oracle.odi.navigator\lib\binding-2.0.2.jar
coherence.jar 3.7.1.1 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi.sdk\lib\coherence.jar
commons-beanutils-1.7.0.jar 1.6 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi.sdk\lib\commons-beanutils-1.7.0.jar
commons-codec-1.3.jar 1.3 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi.sdk\lib\commons-codec-1.3.jar
commons-collections-3.2.jar 3.2 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi.sdk\lib\commons-collections-3.2.jar
commons-discovery-0.4.jar 0.4 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi.sdk\lib\commons-discovery-0.4.jar
commons-io-1.2.jar 1.2 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi.sdk\lib\commons-io-1.2.jar
commons-lang-2.2.jar 2.2 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi.sdk\lib\commons-lang-2.2.jar
commons-logging-1.1.1.jar 1.1.1 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi.sdk\lib\commons-logging-1.1.1.jar
commons-net-1.4.1.jar 1.4.1 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi.sdk\lib\commons-net-1.4.1.jar
commons-vfs-1.0.jar 1.0 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi.sdk\lib\commons-vfs-1.0.jar
cpld.jar Version 11.1.2.0.00, Build 589, March 25 2010 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi.sdk\lib\cpld.jar
cpld.jar Version 11.1.2.0.00, Build 589, March 25 2010 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi\client\jdev\extensions\oracle.odi.navigator\lib\cpld.jar
dbswing.jar 011.000.044.108 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi\client\jdev\extensions\oracle.odi.navigator\lib\dbswing.jar
dx.jar 007.001.175.112 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi\client\jdev\extensions\oracle.odi.navigator\lib\dx.jar
eclipselink.jar 2.3.1.v20111018-r10243 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi.sdk\lib\eclipselink.jar
enterprise_data_quality.jar 11.1.1.6.0 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi.sdk\lib\enterprise_data_quality.jar
enterprise_data_quality.jar 11.1.1.6.0 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi\client\jdev\extensions\oracle.odi.navigator\lib\enterprise_data_quality.jar
ess_es_server.jar 11.1.2.0.0.615 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi\client\jdev\extensions\oracle.odi.navigator\lib\ess_es_server.jar
ess_es_server.jar 11.1.2.0.0.615 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi.sdk\lib\ess_es_server.jar
ess_japi.jar 11.1.2.0.0.615 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi.sdk\lib\ess_japi.jar
ess_japi.jar 11.1.2.0.0.615 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi\client\jdev\extensions\oracle.odi.navigator\lib\ess_japi.jar
fmwgenerictoken.jar 1.0.0.0 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi\client\jdev\extensions\oracle.odi.navigator\..\..\..\..\..\odi_misc\fmwgenerictoken.jar
forms-1.2.0.jar 1.2.0 2008-02-23 08:37:50 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi\client\jdev\extensions\oracle.odi.navigator\lib\forms-1.2.0.jar
groovy-all-1.7.4.jar 1.7.4 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi\client\jdev\extensions\oracle.odi.navigator\scripting\groovy-all-1.7.4.jar
hsqldb.jar 2.0.0 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi.sdk\lib\hsqldb.jar
http_client.jar December 9 2011 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi.sdk\lib\http_client.jar
javax.security.jacc_1.0.0.0_1-1.jar 1.1 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi.sdk\lib\javax.security.jacc_1.0.0.0_1-1.jar
mail.jar 1.4 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi.sdk\lib\mail.jar
odi-sap.jar 10.1.3.12 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi.sdk\lib\odi-sap.jar
odi_hfm.jar 11.1.1.6.0 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi.sdk\lib\odi_hfm.jar
odi_hfm.jar 11.1.1.6.0 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi\client\jdev\extensions\oracle.odi.navigator\lib\odi_hfm.jar
odihapp_essbase.jar 11.1.1.6.0 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi\client\jdev\extensions\oracle.odi.navigator\lib\odihapp_essbase.jar
odihapp_essbase.jar 11.1.1.6.0 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi.sdk\lib\odihapp_essbase.jar
odihapp_planning.jar 11.1.1.6.0 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi.sdk\lib\odihapp_planning.jar
odihapp_planning.jar 11.1.1.6.0 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi\client\jdev\extensions\oracle.odi.navigator\lib\odihapp_planning.jar
odihapp_planning_11.1.2.0.jar 11.1.1.6.0 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi.sdk\lib\odihapp_planning_11.1.2.0.jar
odihapp_planning_11.1.2.0.jar 11.1.1.6.0 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi\client\jdev\extensions\oracle.odi.navigator\lib\odihapp_planning_11.1.2.0.jar
ojdbc6dms.jar 11.2.0.3.0 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi\client\jdev\extensions\oracle.odi.navigator\lib\ojdbc6dms.jar
oracle.ucp_11.1.0.jar 11.2.0.3.0 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi.sdk\lib\oracle.ucp_11.1.0.jar
pop3.jar 1.1.1 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi.sdk\lib\pop3.jar
spring-aop.jar 2.0.3 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi.sdk\lib\spring-aop.jar
spring-beans.jar 2.0.3 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi.sdk\lib\spring-beans.jar
spring-context.jar 2.0.3 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi.sdk\lib\spring-context.jar
spring-core.jar 2.0.3 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi.sdk\lib\spring-core.jar
spring-dao.jar 2.0.3 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi.sdk\lib\spring-dao.jar
spring-jdbc.jar 2.0.3 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi.sdk\lib\spring-jdbc.jar
spring-jmx.jar 2.0.3 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi.sdk\lib\spring-jmx.jar
spring-jpa.jar 2.0.3 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi.sdk\lib\spring-jpa.jar
spring-web.jar 2.0.3 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi.sdk\lib\spring-web.jar
trove.jar 2.1.0 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi.sdk\lib\trove.jar
wlthint3client.jar 10.3.5.0 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi.sdk\lib\wlthint3client.jar
woodstox.jar 3.2.1 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi\client\jdev\extensions\oracle.odi.navigator\lib\woodstox.jar
Extensions
Name Identifier Version Status
==== ========== ======= ======
BM metamodel framework oracle.bm.meta 11.1.1.6.38.61.92 Loaded
Code Editor oracle.ide.ceditor 11.1.1.6.38.61.92 Loaded
Diagram Framework oracle.diagram 11.1.1.6.38.61.92 Loaded
Diagram Javadoc Extension oracle.diagram.javadoc 11.1.1.6.38.61.92 Loaded
Diagram Thumbnail oracle.diagram.thumbnail 11.1.1.6.38.61.92 Loaded
Extended IDE Platform oracle.javacore 11.1.1.6.38.61.92 Loaded
Groovy Support oracle.ide.groovy 11.1.1.4.37.58.91 Loaded
Help System oracle.ide.help 11.1.1.6.38.61.92 Loaded
Import/Export Support oracle.ide.importexport 11.1.1.6.38.61.92 Loaded
Index Migrator support oracle.ideimpl.indexing-migrator 11.1.1.6.38.61.92 Loaded
JViews Registration Addin oracle.diagram.registration 11.1.1.6.38.61.92 Loaded
Log Window oracle.ide.log 11.1.1.6.38.61.92 Loaded
Modeler Framework oracle.modeler 11.1.1.6.38.61.92 Loaded
Modeler Framework Common Layer oracle.modeler.common 11.1.1.6.38.61.92 Loaded
Navigator oracle.ide.navigator 11.1.1.6.38.61.92 Loaded
ODI Navigator oracle.odi.navigator 11.1.1.6.0.0.0 Loaded
Object Gallery oracle.ide.gallery 11.1.1.6.38.61.92 Loaded
Oracle Data Integrator oracle.odi 11.1.1.6.0.0.0 Loaded
Oracle IDE oracle.ide 11.1.1.6.38.61.92 Loaded
Peek oracle.ide.peek 11.1.1.6.38.61.92 Loaded
Persistent Storage oracle.ide.persistence 11.1.1.6.38.61.92 Loaded
Property Inspector oracle.ide.inspector 11.1.1.6.38.61.92 Loaded
Runner oracle.ide.runner 11.1.1.6.38.61.92 Loaded
Virtual File System oracle.ide.vfs 11.1.1.6.38.61.92 Loaded
Web Browser and Proxy oracle.ide.webbrowser 11.1.1.6.38.61.92 Loaded
audit oracle.ide.audit 11.1.1.6.38.61.92 Loaded
jdukshare oracle.bm.jdukshare 11.1.1.6.38.61.92 Loaded
mof-xmi oracle.mof.xmi 11.1.1.6.38.61.92 Loaded
oracle.ide.indexing oracle.ide.indexing 11.1.1.6.38.61.92 Loaded
palette2 oracle.ide.palette2 11.1.1.6.38.61.92 Loaded
You might want to look at the documentation related to Microsoft Excel: Microsoft Excel - 11g Release 1 (11.1.1)
-
Data conversion while exporting data into flat files using the export wizard in SSIS
Hi ,
While exporting data to a flat file through the export wizard, the source table has NVARCHAR types.
Could you please help me on how to do the data conversion while using the export wizard?
Thanks.
Hi Avs sai,
By default, the columns in the destination flat file will be non-Unicode columns, i.e. the data type of the columns will be DT_STR. If you want to keep the original DT_WSTR data type of the input column when outputting to the destination file, you can check
the Unicode option on the "Choose a Destination" page of the SQL Server Import and Export Wizard. Then, on the "Configure Flat File Destination" page, you can click the "Edit Mappings…" button to check the data types. Please see the screenshot:
Regards,
Mike Yin
TechNet Community Support -
Error loading essbase data into flat files through DIM
Hi All,
While running the session in DIM 8.1.1 that loads data from Hyperion Essbase (11.1.1.3) to a flat file, it fails.
The log file shows that it fails to load the HASReader plugins.
All the environment variables are properly set.
Still I am facing this issue.
Could anyone suggest what I am missing?
Thanks.
-
How to load a flat file into BW-BPS using Web Browser
Hello, I have a problem with the "How to" paper. I want to upload an Excel CSV file, but the paper only describes a txt file upload. Can anybody help me? Thanks!
You need to parse the line coming in from the flat file...
You can do this with generic types in your flat file structure (string).
Then you loop through the table of strings that is your flat file and parse the string so that it breaks up the line for each comma. There is an ABAP command called: SPLIT - syntax is as follows:
SPLIT dobj AT sep INTO
{ {result1 result2 ...} | {TABLE result_tab} }
[IN {BYTE|CHARACTER} MODE].
Regards,
Zane -
Error when exporting to flat file in ODI 11g
This works ok in ODI 10g. I'm using IKM SQL to File Append on Windows Server 2008 R2
Getting the following error when exporting to a flat file in ODI 11g: ODI-40406: Bytes are too big for array
I've seen a couple of threads like this on the forum, but I've never seen an answer to the problem.
The problem is with the difference in behaviour of the IKM SQL to File Append KM between 10g and 11g.
Our 10g target file datastore had a mixture of fixed string and numeric columns. Mapping from the source to target was simple one to one column mapping. It generated the desired fixed format flat file; numerics were right adjusted with embedded decimal point and leading spaces. Each numeric column in the generated flat file occupied the exact space allocated to it in the layout. This gave the desired results, even though documentation for the 10g IKM states that all columns in the target must be string.
When we converted to 11g and tried to run this interface, it generated an error on the "numeric" columns because it was wrapping them in quotation marks. The result column was being treated as string, and it was larger than the defined target once it acquired those quotation marks.
In order to get 11g to work, I had to change all the numeric columns in the target datastore to fixed string 30. I then had to change the mapping for these numeric columns to convert them to right-adjusted character strings (i.e. RIGHT(SPACE(30) + RTRIM(MyNumericColumn), 30)).
Now it works. -
Doubts in SAP(Idoc)-XI-Flat file scenario
Dear All,
I am sending Delivery Idoc from R/3 and I am able to view the Idoc in XI in transaction IDX5. Also i m able to see the XML structure with the data of the idoc in SXMB_MONI.
Now as per my scenario I have to download this to a flat file from XI.
My question is, should I have to do the creation of Data types, message types, mapping interface & interface mapping in IR or should I directly do the designing in the ID.
waiting for your reply.
Warm regards,
N.Jain
Hi,
DT, MT, MI: you will create these only for the receiver flat file side.
First, in the Integration Repository:
1. Import the structure of the IDoc used in your scenario, under Imported Objects -> IDocs.
a. In Interface Objects
1. Create a Data Type for your flat file
2. Create a Message Type
3. Create a Message Interface (inbound, asynchronous)
b. In Mapping Objects
1. Create a Message Mapping: IDoc to the Message Type of the flat file.
2. Create an Interface Mapping: IDoc to flat file, using the Message Mapping you have created.
Activate all of these objects.
In Integration Directory :
1. Create Communication channel for File Receiver.
2. Sender Communication channel is not required.
3. Create Receiver Agreement.
4. Sender Agreement is not required.
5. Interface Determination
6. Receiver Determination
At sender side you will take your IDoc and at receiver side you will take your flat file.
It can help you,
Regards,
Sandeep Kaushik -
Flat File to InfoCube Load Issue
Hello All,
I am a newbie in the SAP BW land. Probably a simple question for the BW experts.
This is a crude implementation of the Fu and Fu example.
Environment: SAP BW 7.0
I am trying to load a flat file into the InfoCube.
1. I am able to successfully preview the data from the data sources.
2. I am also able to load the data to the PSA, which I can see from the PSA maintenance screen.
The issue arises when I try to load the data into the InfoCube by running the DTP process; it just does not load the data.
I am unable to figure out what the issue is.
Following is the structure of the flat file with one line of data:
Day Unit of Measure Customer ID Material Number Sales Representative ID Sales Office Sales Region Price of Material Sales quantity Sales revenue Currency
20100510 gms 1122334455 92138 9710274 Chicago NorthEast 890 50 928 USD
Following is the structure of the Info Cube
Characteristics
Time
Unit
Customer ID
Material Number
Sales Representative ID
Sales Office
Sales Region
Measures
Price of Material
Sales quantity
Sales revenue
Also, where can I find a data flow diagram or the steps of what needs to be done to design a standard cube loading process?
Please tell me where I am going wrong.
I think I made progress, but I still do not see the data in the InfoCube.
When I visit the PSA via the source system -> Manage PSA -> select the line item -> PSA Maintenance, I see the record completely, like this:
@5B@ 1 1 05/10/2010 GMS 1,122,334,455 92,138 9,710,274 CHICAGO NORTHEAST 890 50 928
But when I run the DTP process I see the following error message.
I think this is due to a data type mismatch between the flat file and the InfoCube.
Please advise.
The error I see is:
@5C@ No SID found for value 'GMS ' of characteristic 0UNIT BRAIN 70 @35@ @3R@
Also, where can I click to extract the InfoCube structure in plain text so that I can post it on the forum for better visibility? -
Export SQL View to Flat File with UTF-8 Encoding
I've set up a package in SSIS to export a SQL view to a flat file and it's working fine. I now need to make that flat file UTF-8 encoded. The package executes but still shows the files as ANSI encoded.
My package consists of a Source (SQL View) -> Derived Column (casts the fields to DT_WSTR) -> Destination Flat File (set to output a UTF-8 file).
I don't get any errors to help me troubleshoot further. I'm running SQL Server 2005 SP2.
Unless there is a Byte-Order-Marker (BOM - hex file prefix: EF BB BF) at the beginning of the file, and unless your data contains non-ASCII characters, I'm unsure there is a technical difference in the files, Paul.
That is, even if the file is "encoded" UTF-8, if your data is only ASCII values (decimal values 0-127, hex 00-7F), UTF-8 doesn't really serve a purpose over ANSI encoding. Now if you're looking for UTF-8 with specifically the BOM included, and your data is all standard ASCII, the Flat File Connection Manager can't do that, it seems.
What the flat file connection manager is doing correctly though, is encoding values that are over decimal 127/hex 7F in UTF-8 when the encoding of the connection manager is set to 65001 (UTF-8).
Example:
Input data built with a script component as a source (code at the bottom of this post) and with only one WSTR output column hooked to a flat file destination component:
a string containing only decimal value 225 (Latin small letter a with acute - á)
Encoding set to ANSI 1252 looks like:
E1 0D 0A (which is the ANSI encoding of the decimal character value 225 (E1) and a CR-LF (0D 0A))
Encoding set to UTF-8 65001 looks like:
C3 A1 0D 0A (which is the UTF-8 encoding of the decimal character value 225 (C3 A1) and a CR-LF (0D 0A))
Note that for values over decimal 127, UTF-8 takes at least two bytes and up to four for the remaining values available.
So, I'm comfortable now, after sitting down and going through this, that the flat file connection manager is working correctly, unless you need a BOM.
Imports System
Imports System.Data
Imports System.Math
Imports Microsoft.SqlServer.Dts.Pipeline.Wrapper
Imports Microsoft.SqlServer.Dts.Runtime.Wrapper

Public Class ScriptMain
    Inherits UserComponent

    Public Overrides Sub CreateNewOutputRows()
        Output0Buffer.AddRow()
        Output0Buffer.col1 = ChrW(225)
    End Sub

End Class
Phil