POS flat file to IS-Retail upload - please look
Hi experts,
The requirement:
The interface will retrieve data from the LOGISOFT POS system. The POS system will output all the information needed to generate invoices in the SAP Retail system.
I need to get the flat file generated by the POS system into IS-Retail and post it through IDocs.
Can somebody let me know whether there is a standard program or function module to get the flat file and post it, and also which IDoc type and function module I need to process this inbound functionality?
regards,
naz
Edited by: N a z e e r on Dec 31, 2008 5:47 PM
You can use the IDoc type WPUUMS for inbound processing of the IDocs, and for sending IDocs to the POS you can use transaction WPMA.
Similar Messages
-
Delete data from a flat file using PL/SQL -- Please help.. urgent
Hi All,
We are writing data to a flat file using the Text_IO.Put_Line command. We want to delete a record from that file after the data has been written to it.
Please let us know if this is possible.
Thanks in advance.
Vaishali
There's nothing in UTL_FILE to do this, so your options are either to write a Java stored procedure to do it or to use whatever mechanism the host operating system supports for editing files.
Alternatively, you could write a PL/SQL procedure to read each line in from the file and then write them out to a second file, discarding the line you don't want along the way. -
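Purely to illustrate the second suggestion above, here is the same read-filter-write idea sketched in Python rather than PL/SQL; the file names and the matching rule are placeholders for the sketch, not part of the original question.
# Rewrite a flat file without the records we want to drop.
def remove_matching_lines(src_path, dst_path, unwanted_substring):
    with open(src_path, "r") as src, open(dst_path, "w") as dst:
        for line in src:
            # keep every line except the ones we want to delete
            if unwanted_substring not in line:
                dst.write(line)

remove_matching_lines("data.txt", "data_clean.txt", "RECORD_TO_DELETE")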
RSEINB00 flat file to idoc uploading in XI
Hi,
I've searched a lot of posts here in SDN on how to use RSEINB00 to upload flat file IDocs, and I tend to get lost as there are conflicting answers...
Can anyone give me a clear/consolidated answer on how to use RSEINB00 in XI?
Correct me if I'm wrong, but the process is basically to upload a flat file IDoc (residing on the XI application server) using RSEINB00; after a successful upload you're supposed to see your IDoc in SXMB_MONI (the IDoc flat file will be converted to IDoc XML using the metadata loaded automatically in IDX2 and be passed to the XI pipeline).
Is it necessary to first finish all the setup in the Integration Builder to enable the automatic loading of the metadata in IDX2?
Also, what is the difference between executing RSEINB00 in a 4.5B to 4.5B scenario and in a 4.5B to XI scenario (RSEINB00 existed even before XI)? Is it the IDoc XML conversion? If so, then importing the IDoc type in the XI Integration Builder Design is definitely necessary, right?
Hope someone can explain clearly
I'll be more than willing to give points to the answer that will completely explain the concept..
Thanks in advance
Hi HJ,
Report RSEINB00 is not a dedicated XI report. You will NOT see the IDoc in SXMB_MONI.
Report RSEINB00 just calls a function module:
IDOC_INBOUND_FROM_FILE
This is the documentation of IDOC_INBOUND_FROM_FILE, which unfortunately is only available in German (translated here):
The function module performs IDoc inbound processing for the IDocs contained in the file (dataset) passed to it. The IDocs are read from the file and stored in the IDoc database. IDocs that can be processed further are passed on asynchronously to the application. The decoupling in time between reading and saving on the one hand and processing by the application on the other is achieved by raising a workflow event. The event container generally holds more than just one IDoc number. The task triggered by the event passes all the IDocs on to the application.
A COMMIT is not issued after each individual IDoc has been processed; instead it is triggered under the control of a record counter.
Normally IDocs are posted to an R/3 system and are processed automatically.
If you have an IDoc lying in the file system of your R/3 you can upload this file and trigger the automatic processing. Your R/3 knows what to do with every single IDoc type. IDocs point to either
a) function modules or
b) workflows
Hope that helps
Regards Mario -
Problem uploading a flat file into BW-BPS using a web browser
Hi All,
I have followed the steps in the how-to guide to upload a flat file. It seems to be fine, but when I try to upload by running the URL it gives the error "Value of variable Data Slice Global ( ZFIE0ALL ) cannot be determined" and the warning "Errors occurred when executing planning function TUPLOAD(EXIT FOR UPLOAD DATA)/T0000000(WEB UPLOAD".
Since I am new to BW-BPS, please also let me know how to test it; when I run the file_upload URL directly it gives the following error.
SAP Note
The following error text was processed in the system:
An exception with the type CX_SY_REF_IS_INITIAL occurred, but was neither handled locally, nor declared in a RAISING clause
When I run the page1 URL and then the file_upload URL, it prompts me for the upload function and the path of the flat file. But when I enter the path of the flat file and press the upload function button it gives the error "Value of variable Data Slice Global ( ZFIE0ALL ) cannot be determined" and the warning "Errors occurred when executing planning function TUPLOAD(EXIT FOR UPLOAD DATA)/T0000000(WEB UPLOAD".
FYI, I am using the new-design HTMLB option from the how-to document.
Please help me to resolve this. Thanks in advance.
Also, please let me know whether I can debug this application, and how.
Vishal
Message was edited by:
vishal kashyap
Message was edited by:
vishal kashyap
Hi Vishal,
Can you guide me on the condition by which the data is selected into XTH_DATA (hashed table) before the data is written by the flat file function module?
Please reply ASAP,
Thanks
Anup Singh -
Delta upload for flat file extraction
Hello everyone,
Is it possible to initialize a delta update for a flat file extraction? If yes, please explain how to do that.
Hi Norton,
For a flat file data source, the upload is always FULL.
Please refer to following doc:
http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/a0b3f0e2-832d-2c10-1ca9-d909ca40b54e?QuickLink=index&overridelayout=true&43581033152710 if you need to extract delta records from a flat file. We can write a routine at the InfoPackage level.
Regards,
Harish. -
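To illustrate the idea from the reply above: the linked how-to builds the delta with an ABAP routine at InfoPackage level, but the underlying logic is simply comparing the new full file against the previous snapshot and keeping only the records that changed. A minimal Python sketch of that idea, with placeholder file names, might look like this:
# Derive a pseudo-delta by comparing today's full extract with the previous snapshot.
def load_lines(path):
    with open(path, "r", encoding="utf-8") as f:
        return set(line.rstrip("\n") for line in f)

previous = load_lines("extract_previous.csv")   # snapshot kept from the last load
current = load_lines("extract_today.csv")       # today's full extract

# Records present today but not in the previous snapshot: new or changed rows.
delta = current - previous

with open("extract_delta.csv", "w", encoding="utf-8") as f:
    for line in sorted(delta):
        f.write(line + "\n")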
Downloading data in Japanese from SAP R/3 30F to a flat file
Hi All,
We need to extract data (from the MM, SD, FI, and CO modules, for reporting purposes) which is in Japanese from R/3 30F to a flat file, and then upload it from the flat file to BW 3.5.
When we try to download some sample records, the Japanese characters come out as boxes.
Can anyone suggest how to extract the data in Japanese into a flat file,
and also whether there are any standard extractors available to extract all of MM, FI, CO, SD, etc. so that we can use them?
Please give some solution to above problem
Thanks&Regards
Ranga Rao
Hi,
I have no direct experience with this, but I read some documentation after seeing your question that may be helpful for you.
check the below link.
http://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/5411d290-0201-0010-b49b-a6145fb16d15
Regards,
Vijay -
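A general note on the "boxes" symptom discussed in the thread above: outside of the SAP-specific download settings, the principle is that the flat file has to be written and read with a character encoding that can actually represent Japanese text; if either side falls back to a Latin-1 style codepage, the characters turn into boxes or question marks. A minimal Python illustration of the principle (the file name and sample rows are placeholders, not from the original thread):
# Write and read the flat file with an explicit Unicode encoding.
sample_rows = ["4711\t東京\t1000.00", "4712\t大阪\t2000.00"]

with open("sales_jp.txt", "w", encoding="utf-8") as f:
    for row in sample_rows:
        f.write(row + "\n")

# Reading it back with the same encoding preserves the Japanese text;
# reading it with e.g. latin-1 would produce garbage instead.
with open("sales_jp.txt", "r", encoding="utf-8") as f:
    print(f.read())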
After working with flat files (especially text and CSV) for some time in my current role, I have decided to come up with this little discussion, which I believe may benefit peers who might not yet have come across some of the issues raised in this piece. This piece is just a summary of what I thought would be useful hints for anyone to consider when working with CSV or text files. Everything in this writing is purely from my experience and I am discussing it at the SSIS level. I believe most of the actions can also be applied in SSMS when dealing with flat files there.
Flat files as destination file
Exporting data to a flat file destination is relatively easy and straightforward. There aren't as many issues here as there are when one imports data from flat files. However, whatever you do here hugely affects how easy and straightforward it will be to import data later on from the flat file that you create at this stage. There are a few things to note. When you open your Flat File Destination component you will see a number of options; below is a discussion of some of them. I have found it useful to take the actions mentioned below.
Column names in the first row – If you want the column names to be in the first row of your file, make sure that you always check the box with that option. SSIS does not do that for you by default.
Column delimiter – I always prefer to use Tab {t}. The default delimiter is Comma {,}. The problem that I have found with comma delimiters is that if the values have inherent commas within them, the system does not always get it right in determining where the column ends. As a result you may end up with rows whose values are shifted and misplaced from their original columns owing to the wrong column allocation by the system (see the sketch at the end of this section).
AlwaysCheckForRowDelimiters – this is a Flat File Connection Manager property. The default setting of this property is True. At one time I found myself having to use this after I faced a huge problem with breaking and misplacement of rows
in the dataset that I was dealing with. The problem emanated from the values in one of the columns. The offending column had values of varchar datatype which were presented in the form of paragraphs with all sorts of special characters within them, e.g.
This is ++, an example of what – I mean… the characters ;
in the dataset which gave me: nearly 100% ++ headaches – looked like {well}; this piece
OF: example??
You can see from the above italicised dummy value example what I mean. Values such as that make the system prematurely break the rows. I don't know why, but the somewhat painful experience that I had with this led me to the conclusion that I should not leave the system to auto-decide where the row ends. As such, when I changed the property
AlwaysCheckForRowDelimiters from True to False, along with the recommendations mentioned in items 1 and 2 above, the breaking and misplacement of rows was solved. By breaking I mean that you will find one row in a table being broken into two or three separate rows in the flat file. This is carried over to the new table into which that flat file is loaded.
Addendum
There is an additional option which I have found to work even better if one is experiencing issues with breaking of rows due to values being in the 'paragraph' format as illustrated above. The option for 'paragraphed' values which I explained earlier works, but not always, as I have realised. If that option does not work, this is what you are supposed to do:
When you SELECT the data for export from your table, make your SELECT statement look something like this:
SELECT
ColumnName1,
ColumnName2,
ColumnName3,
REPLACE(REPLACE(OffendingColumnName, CHAR(10), ''), CHAR(13), '')
AS OffendingColumnName,
ColumnName4
FROM
MyTableName
The REPLACE function gets rid of the breaks on your values. That is, it gets rid of the paragraphs in the values.
I would suggest use of double dagger column delimiters if using this approach.
Text or CSV file?? – In my experience going with the text file is always efficient. Besides, some of the things recommended above only work in text files (I suppose so; I stand to be corrected on this). An example of this is column delimiters.
Item 2 above recommends use of the Tab {t} column delimiter, whereas in CSV, as the name suggests, the delimiters are commas.
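To make the delimiter and row-break points above concrete, here is a small Python illustration (not SSIS itself) of why a naive comma join breaks on values that contain commas or line breaks, and how a tab delimiter with proper quoting keeps each record intact; the sample rows and file name are invented for the sketch.
import csv

rows = [
    ["10001", "Smith, John", "A value\nspanning two lines"],
    ["10002", "Plain value", "Another value"],
]

# Naive comma join: the embedded comma and newline shift and break the columns.
for row in rows:
    print(",".join(row))

# csv.writer with a tab delimiter and quoting keeps each row as one logical record.
with open("export.txt", "w", newline="") as f:
    writer = csv.writer(f, delimiter="\t", quoting=csv.QUOTE_ALL)
    writer.writerows(rows)

# Reading it back with the same settings restores the original values intact.
with open("export.txt", newline="") as f:
    for row in csv.reader(f, delimiter="\t"):
        print(row)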
Flat files as source file
In my experience, most of the headaches of working with flat files show up when importing data from flat files. A few examples of the headaches that I'm talking about are things such as:
Datatypes and datatype length, if using string
Shifting and misplacement of column values
Broken rows, with some pseudo-rows appearing in your import file
Double quotation marks in your values
Below I will address some of the common things which I have personally experienced and which I hope will be useful to other people. When you open your Flat File Source component you will see a number of options; below is a discussion of some of them. I have found it useful to take the actions mentioned below.
Retain null values from the source as null values in the data flow – this option comes unchecked by default. Since the time I noticed the importance of putting a check mark in it, I always make sure that I check it. I started doing this after some of my rows in the destination table came up with shifted and misplaced column values. By shifted and misplaced column values I mean certain values appearing under columns where you do not expect them, showing that the value has been moved from its original column to another column where it does not belong.
Text qualifier – the default entry here is <none>. I have found that it is always handy to insert double quotes here (“). This will eliminate any double quotes which the system may have included at the time when the flat file was
created. This happens when the values in question have commas as part of the characters in them.
Column delimiter – this solely depends on the column delimiter which was specified at the time when the flat file was created. The system default is Comma {,}. Please note that if the delimiter specified here is different from the one in
your flat file the system will throw up an error with a message like “An error occurred while skipping data rows”.
Column names in the first data row – if you want the first row to be column names put a check mark on this option.
Datatypes and datatypes length
By default when you import a flat file your datatypes for all the columns come up as varchar (50) in SSIS. More often than not if you leave this default setup your package will fail when you run it. This is because some of the values in some of your columns
will be more than 50 characters, the default length. The resulting error will be a truncation error. I have found two ways of dealing with this.
Advanced – This is an option found on the Flat File Source Editor. Once this option is selected on your Flat File Source Editor you will be presented with a list of columns from your flat file. To determine your datatypes and length there
are two possible things that you can do at this stage.
Go column by column – going column by column you can manually input your desired datatypes and lengths on the Flat File Source Editor through the Advanced option.
Suggest types – this is another option under Advanced selection. What this option does is suggest datatypes and lengths for you based on the sample data amount that you mention in the pop-up dialog box. I have noticed that while this is
a handy functionality, the problem with it is that if some of the values from the non-sampled data have lengths bigger than what the system would have suggested the package will fail with a truncation error.
View code – this is viewing of the XML code. If, for example, you want all your columns to be 255 characters long in your landing staging table:
Go to your package name, right click on it and select the option View code from the list presented to you. XML code will then come up.
Hit Ctrl + F to get a “Find and Replace:” window. On “Find What” type in
DTS:MaximumWidth="50" and on “Replace with:” type in
DTS:MaximumWidth="255". Make sure that under “Look in” the selection is
Current Document.
Click “Replace All” and all your default column lengths of 50 characters will be changed to 255 characters.
Once done, save the changes. Close the XML code page. Go to your package GUI designer. You will find that the Flat File Source component at this point will be highlighted with a yellow warning triangle. This is because the metadata definition has changed.
Double click the Flat File Source component and then click OK. The warning will disappear and you will be set to pull and load your data into your staging database with all columns being varchar (255). If you need to change any columns to specific data types you can use either the Data Conversion component or the Derived Column component for that purpose, or you can use both components depending on the data types that you will be converting to.
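If you have to repeat that find-and-replace often, the same edit can be scripted. Here is a small Python sketch that applies it to a package file; the package name is a placeholder, and you should keep a backup of the original .dtsx before letting any script rewrite it.
# Bulk-change the default column width in an SSIS package file,
# exactly like the manual Find and Replace described above.
path = "MyPackage.dtsx"   # placeholder package name

with open(path, "r", encoding="utf-8") as f:
    xml_text = f.read()

xml_text = xml_text.replace('DTS:MaximumWidth="50"', 'DTS:MaximumWidth="255"')

with open(path, "w", encoding="utf-8") as f:
    f.write(xml_text)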
Dynamic Flat File Name and Date
Please see this blog
http://www.bidn.com/blogs/mikedavis/ssis/153/using-expression-ssis-to-save-a-file-with-file-name-and-date
There is so much to flat files to be discussed in one piece.
Any comments plus additions (and subtractions too) to this piece are welcome.
Mpumelelo
You could create a
WIKI about this subject. Also see
http://social.msdn.microsoft.com/Forums/sqlserver/en-US/f46b14ab-59c4-46c0-b125-6534aa1133e9/ssis-guru-needed-apply-within?forum=sqlintegrationservices
Please mark the post as answered if it answers your question | My SSIS Blog:
http://microsoft-ssis.blogspot.com |
Twitter -
Flat Files: Use of Business Content or not?
Hello Experts,
I am wondering, with all the extra effort in the research to identify cubes with preferred fields for a BW project, especially if the source is a flat file, what actually are the advantages of going through this analysis to use the SAP-delivered objects?
Thanks in advance.
Hi Amanda,
If you are on a project in a functional area which is already covered by SAP, then it is preferable to make an analysis of Business Content objects. Even if you are going to use flat files for data upload, predelivered InfoSources, ODSs, cubes, transfer and update rules, and queries may save you a lot of time.
Search help.sap.com for your functional area, looking for InfoObjects and providers. Try to evaluate whether they are useful. Certainly, some knowledge of the functional area is a great advantage.
Search the Business Content in the functional area. You may even simulate a BC installation. For example, choose a cube, with the data flow before and after, and gather the objects. The scope of these objects may give you some hints for the evaluation.
If there is just one project, yours, on a BW instance, then you can use or even modify the BC objects that you found. The master data loaded will not interfere with other projects.
If the functional area of your project is not covered by SAP, then most likely you'll be able to use just a few BC objects; all the others you'll have to create.
Sometimes, even if you are in a functional area for which you'll find predelivered objects, it is more suitable to make copies of such objects and modify them, deleting attributes that you'll never use. Use of standard InfoObjects with many attributes may force users to ask a lot of questions about these attributes (why, what's this, what for, etc.). Moreover, if these attributes are not used, you'll be preparing your flat files with a lot of blank fields for the attributes. Consider also the extra DB space for them.
Doing the project almost from scratch (with minimal usage of BC objects) will require very careful conceptual design of the data model.
Best regards,
Eugene -
Automation of Query generation and Conversion to Flat File CSV
I have a requirement to generate / execute the refresh of an existing query in BEx Analyzer automatically on the 3rd business day of every quarter, and also to convert it to a flat file after generation. Please give me some documents or exact steps on how to go about this
automation. Please reply in detail and soon.
Thanks
Soniya
Hi Sir, I will be very grateful if you can clarify:
1. The user wants the flat file on the company server, which means there are folders on my company's BMW server, and he wants to see the flat file there so that he can review it.
2. I saw AL11 and its directories. How can I create my own directory, or create the file in DIR_HOME and see it in flat file format? At the moment it shows up like a screen, which my user does not want; he just wants a flat file.
3. Please suggest whether the RSCRM_BAPI method will work, or whether you would suggest the download scheduler or the reporting agent, and how.
4. Sorry for the question, but my problem and the user's requirement are not yet solved.
please reply soon
thanks
soniya -
Creating JCo IDoc from flat file structure
Hi,
I need to send an IDoc into SAP using JCo.
The input to my program is a string containing lines representing a flat file IDoc, e.g.
Line 1="EDI_DC40 2 ORDERS04.."
Line 2="E1EDK01 00000100000001 USD..."
Line 3="E1EDK14 0000030000000...."
Is there a simple way to use JCo to create & send the IDoc?
i.e.
1) If I use JCo and RFC IDOC_INBOUND_ASYNCHRONOUS, what would be all the steps/calls to SAP (create TID, call IDOC_INBOUND_ASYNCHRONOUS, confirm TID..?)
And can IDOC_INBOUND_ASYNCHRONOUS be called using the flat file structures (without having to map to all the JCo ParameterList fields)? Since the flat file structures are in the format required by the RFC, just in one long string.
Line 1=>IDOC_CONTROL_REC_40
Lines 2..n=>IDOC_DATA_REC_40
2) Similarly, if I were to use JCo plus the JCO IDoc library, is there a way to pass the flat file structures without having to do all the mapping to segment fields?
3) Other options..?
I want to use ALE to SAP, not files, even though the input is in the flat file structure.
Your reply gives a link to the general JCo documentation.
It doesn't give ideas on how to call an RFC or IDoc from JCo without mapping each and every field from a flat file structure.
I'm looking for a way to do something like this:
Function IDOC_INBOUND_ASYNCHRONOUS has table parameters
IDOC_CONTROL_REC_40 STRUCTURE EDI_DC40
IDOC_DATA_REC_40 STRUCTURE EDI_DD40
Since I have the flat file representation of the IDoc, the first line should overlay exactly onto the EDI_DC40 structure. And the subsequent lines should overlay onto EDI_DD40. (all fields in this RFC are strings)
However, JCo and the JCo IDoc library seem very strongly typed, so it looks like I would have to map each field from the flat file structure to a field in the JCo Function or JCo IDoc object.
This could be done in a generic way using the function/idoc metadata, however there would still be some overhead.
Is there a way to get round this, and build the function/idoc treating its parameters as one long string? -
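The "overlay" idea described above is easy to sketch outside of JCo: given the field lengths of the control and data record structures, the long string can be cut into named fields generically. The field names and lengths below are made-up placeholders, not the real EDI_DC40/EDI_DD40 layouts, so treat this purely as an illustration of slicing by metadata rather than as working integration code.
# Hypothetical field metadata: (field name, length). The real EDI_DC40 /
# EDI_DD40 structures have many more fields with different lengths.
CONTROL_FIELDS = [("TABNAM", 10), ("DIRECT", 1), ("IDOCTYP", 30)]
DATA_FIELDS = [("SEGNAM", 30), ("SDATA", 1000)]

def overlay(line, fields):
    # Cut one flat-file line into a dict of named string fields.
    result, offset = {}, 0
    for name, length in fields:
        result[name] = line[offset:offset + length].rstrip()
        offset += length
    return result

flat_lines = ["EDI_DC40  2ORDERS04...", "E1EDK01   00000100000001 USD..."]
control = overlay(flat_lines[0], CONTROL_FIELDS)
data_records = [overlay(line, DATA_FIELDS) for line in flat_lines[1:]]
print(control, data_records)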
BI 7.0 Flat file extraction
Hi all....
To get acquainted with BI 7.0 I am trying to extract data from a flat file...
1) Everything is fine; I created DTPs at the cube level as well as at the data source level.
2) I have executed / triggered the InfoPackage to schedule the data load.
3) The request status is green in the InfoCube and it is also available for reporting.
4) When I checked the monitor I could see the overall status green, and yet I could find no data in the InfoCube.
I have done the transformations and DTPs correctly, so why can I not see the data in the InfoCube, and why are all requests set to green and available for reporting?
Thnx in advance
Reg
Ram
Hi,
follow these steps for easy data load from flat file..
Uploading of master data
Log on to your SAP
Transaction code RSA1 leads you to the Modelling view.
1. Creation of Info Objects
In left panel select info object
Create info area
Create info object catalog ( characteristics & Key figures ) by right clicking the created info area
Create new characteristics and key figures under respective catalogs according to the project requirement
Create required info objects and Activate.
2. Creation of Data Source
In the left panel select data sources
Create application component(AC)
Right click AC and create datasource
Specify data source name, source system, and data type ( master data attributes, text, hierarchies)
In general tab give short, medium, and long description.
In extraction tab specify file path, header rows to be ignored, data format(csv) and data separator( , )
In proposal tab load example data and verify it.
In the field tab you can give the technical names of the info objects in the template, and then you do not have to map them during the transformation; the server will automatically map them accordingly. If you do not map them in this field tab, you have to map them manually during the transformation in the InfoProviders.
Activate data source and read preview data under preview tab.
Create an info package by right clicking the data source and in the schedule tab click Start to load data to the PSA. ( Make sure to close the flat file during loading )
3. Creation of data targets
In left panel select info provider
Select created info area and right click to select Insert Characteristics as info provider
Select required info object ( Ex : Employee ID)
Under that info object select attributes
Right click on attributes and select create transformation.
In source of transformation , select object type( data source) and specify its name and source system Note: Source system will be a temporary folder or package into which data is getting stored
Activate created transformation
Create Data transfer process (DTP) by right clicking the master data attributes
In extraction tab specify extraction mode ( full)
In update tab specify error handling ( request green)
Activate DTP and in execute tab click execute button to load data in data targets.
4. Monitor
Right Click data targets and select manage and in contents tab select contents to view the loaded data. Alternatively monitor icon can be used.
BW 7.0
Uploading of Transaction data
Log on to your SAP
Transaction code RSA1 leads you to the Modelling view.
5. Creation of Info Objects
In left panel select info object
Create info area
Create info object catalog ( characteristics & Key figures ) by right clicking the created info area
Create new characteristics and key figures under respective catalogs according to the project requirement
Create required info objects and Activate.
6. Creation of Data Source
In the left panel select data sources
Create application component(AC)
Right click AC and create datasource
Specify data source name, source system, and data type ( Transaction data )
In general tab give short, medium, and long description.
In extraction tab specify file path, header rows to be ignored, data format(csv) and data separator( , )
In proposal tab load example data and verify it.
In the field tab you can give the technical names of the info objects in the template, and then you do not have to map them during the transformation; the server will automatically map them accordingly. If you do not map them in this field tab, you have to map them manually during the transformation in the InfoProviders.
Activate data source and read preview data under preview tab.
Create an info package by right clicking the data source and in the schedule tab click Start to load data to the PSA. ( Make sure to close the flat file during loading )
7. Creation of data targets
In left panel select info provider
Select created info area and right click to create ODS( Data store object ) or Cube.
Specify a name for the ODS or cube and click create
From the template window select the required characteristics and key figures and drag and drop it into the DATA FIELD and KEY FIELDS
Click Activate.
Right click on ODS or Cube and select create transformation.
In source of transformation , select object type( data source) and specify its name and source system Note: Source system will be a temporary folder or package into which data is getting stored
Activate created transformation
Create a Data transfer process (DTP) by right clicking the ODS or cube
In extraction tab specify extraction mode ( full)
In update tab specify error handling ( request green)
Activate DTP and in execute tab click execute button to load data in data targets.
8. Monitor
Right click the data target, select Manage, and in the contents tab select Contents to view the loaded data. There are two tables in an ODS, the new table and the active table; to move data from the new table to the active table you have to activate the request after the data has been loaded. Alternatively the monitor icon can be used.
hope it helps...
all the best.. -
Can i able to put filter for my source flat file?
Hi all,
Please help me with the best practise of ODI.
My source is flat file and i want to put filter.
can i able to put filter for my source flat file? If yes, please help me with the best practise of applying filter.
Regards
Suresh
Hi,
If you try to create it at Model --> Datastore --> Filter --> Insert Condition,
it will not work for the File technology; you will get "Invalid Format Description".
But you can specify a filter in the interface.
Just drop the column(s) from your flat file data store onto the canvas and then specify the filter condition.
Thanks,
Sutirtha -
LSMW - How to view the flat file on App Server
Hi All,
I'm trying to take a look at BC420_DOC_1_HEAD_POS.LEG which is the file for LSMW training BC420. However, this file is stored on the application (NT) server to which I have no access. Can I browse this file using R/3 utilities?
I just want to see what a flat file for the training course looks like.
Thanks so much!
Roman
Hi Roman,
In general, the users will not have access to the directories on the application server directly at the OS level. You will have to go through an SAP program / transaction.
Look at transaction AL11. If the file that you are talking about resides in any of the directories listed there, you will be able to navigate to it.
Regards,
Anand Mandalika. -
Beginner - Flat file upload error Help Please
Hi Everyone,
I am trying to upload a flat file [.csv] for the first time.
I created the InfoObjects Material ID and Material Name, with Price as a key figure.
Material ID holds Material Name and Price as attributes and is flagged With Text and With Master Data.
For the key figure Price, I have selected Amount and defined it as USD.
In the flat file I have Material ID, Material Name, Price, and a Description for the text; it is something like this:
Mid M-Name Price DESCRIPTION
10001 A 99.99 AA
When I create an application component, then create a data source for the master data attributes of Material, select the flat file and upload it, the data type for the key figure PRICE changes to FLTP length 16. It is no longer DEC, and I get some weird data in the Price field, something like
"99.990000000001E+02" instead of 99.99. Why am I having this problem and how can it be resolved?
I thought I would start off with my best foot forward, but it didn't work. Any kind of help would be highly appreciated. Thanks guys. Bye.
Hi,
these are the BI 7.0 steps:
BW 7.0
Uploading of master data
Log on to your SAP
Transaction code RSA1 leads you to the Modelling view.
1. Creation of Info Objects
• In left panel select info object
• Create info area
• Create info object catalog ( characteristics & Key figures ) by right clicking the created info area
• Create new characteristics and key figures under respective catalogs according to the project requirement
• Create required info objects and Activate.
2. Creation of Data Source
• In the left panel select data sources
• Create application component(AC)
• Right click AC and create datasource
• Specify data source name, source system, and data type ( master data attributes, text, hierarchies)
• In general tab give short, medium, and long description.
• In extraction tab specify file path, header rows to be ignored, data format(csv) and data separator( , )
• In proposal tab load example data and verify it.
• In the field tab you can give the technical names of the info objects in the template, and then you do not have to map them during the transformation; the server will automatically map them accordingly. If you do not map them in this field tab, you have to map them manually during the transformation in the InfoProviders.
• Activate data source and read preview data under preview tab.
• Create an info package by right clicking the data source and in the schedule tab click Start to load data to the PSA. ( Make sure to close the flat file during loading )
3. Creation of data targets
• In left panel select info provider
• Select created info area and right click to select Insert Characteristics as info provider
• Select required info object ( Ex : Employee ID)
• Under that info object select attributes
• Right click on attributes and select create transformation.
• In source of transformation , select object type( data source) and specify its name and source system Note: Source system will be a temporary folder or package into which data is getting stored
• Activate created transformation
• Create Data transfer process (DTP) by right clicking the master data attributes
• In extraction tab specify extraction mode ( full)
• In update tab specify error handling ( request green)
• Activate DTP and in execute tab click execute button to load data in data targets.
4. Monitor
Right Click data targets and select manage and in contents tab select contents to view the loaded data. Alternatively monitor icon can be used.
BW 7.0
Uploading of Transaction data
Log on to your SAP
Transaction code RSA1 leads you to the Modelling view.
5. Creation of Info Objects
• In left panel select info object
• Create info area
• Create info object catalog ( characteristics & Key figures ) by right clicking the created info area
• Create new characteristics and key figures under respective catalogs according to the project requirement
• Create required info objects and Activate.
6. Creation of Data Source
• In the left panel select data sources
• Create application component(AC)
• Right click AC and create datasource
• Specify data source name, source system, and data type ( Transaction data )
• In general tab give short, medium, and long description.
• In extraction tab specify file path, header rows to be ignored, data format(csv) and data separator( , )
• In proposal tab load example data and verify it.
• In the field tab you can give the technical names of the info objects in the template, and then you do not have to map them during the transformation; the server will automatically map them accordingly. If you do not map them in this field tab, you have to map them manually during the transformation in the InfoProviders.
• Activate data source and read preview data under preview tab.
• Create an info package by right clicking the data source and in the schedule tab click Start to load data to the PSA. ( Make sure to close the flat file during loading )
7. Creation of data targets
• In left panel select info provider
• Select created info area and right click to create ODS ( Data store object ) or Cube.
• Specify a name for the ODS or cube and click create
• From the template window select the required characteristics and key figures and drag and drop them into the DATA FIELDS and KEY FIELDS
• Click Activate.
• Right click on ODS or Cube and select create transformation.
• In source of transformation , select object type( data source) and specify its name and source system Note: Source system will be a temporary folder or package into which data is getting stored
• Activate created transformation
• Create Data transfer process (DTP) by right clicking the ODS or cube
• In extraction tab specify extraction mode ( full)
• In update tab specify error handling ( request green)
• Activate DTP and in execute tab click execute button to load data in data targets.
8. Monitor
Right click the data target, select Manage, and in the contents tab select Contents to view the loaded data. There are two tables in an ODS, the new table and the active table; to move data from the new table to the active table you have to activate the request after the data has been loaded. Alternatively the monitor icon can be used.
if it is helpful assign points
cheers
sivaRaju -
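Coming back to the original question in this thread about FLTP and 99.99: FLTP is a binary floating point type, and 99.99 has no exact binary representation, which is why the value shows up with extra trailing digits in scientific notation once the tool prints the stored value; keeping the key figure as a DEC (packed decimal) type avoids this. The exact digits you see depend on how the tool formats the float, but the root cause is the same. A quick Python illustration of the effect:
from decimal import Decimal

value = 99.99
print(repr(value))        # prints 99.99, the shortest text that round-trips
print(f"{value:.17E}")    # forcing more digits exposes the inexact stored value, e.g. 9.99899999999999949E+01
print(Decimal(value))     # the exact binary value actually stored, 99.9899999999999948...
print(Decimal("99.99"))   # a decimal type keeps 99.99 exactly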
Unable to upload delimited flat file (Datastore)
Dear All,
Drilling down the tutorial: ODI11g: Creating an ODI Project and Interface: Exporting a Flat File to a Flat File
I am facing quite a problem while reverse-engineering a delimited flat file (Datastore -> Model). No matter which type of delimited flat file is uploaded (txt, csv, prn), the system stays unchanged when I press the "Reverse Engineer" button. Switching to the Fixed Flat File option reveals that the system correctly loads only the first row, the header of the columns (second option from the top), and of course all the rest of the rows end up in the first column.
I was preparing the delimited flat files using Excel 2007 and 2010; in both cases no success. I tried to create a fixed flat file with Excel and got completely confused, since I could not work out how to do that.
Properties are below. Please help!
About
Oracle Data Integrator 11g 11.1.1
Standalone Edition Version 11.1.1
Build ODI_11.1.1.6.0_GENERIC_111219.1055
Copyright (c) 1997, 2011 Oracle. All Rights Reserved.
IDE Version: 11.1.1.6.38.61.92
Product ID: oracle.odi
Product Version: 11.1.1.6.0.0.0
Version
Component Version
========= =======
Java(TM) Platform 1.6.0_21
Oracle IDE 11.1.1.6.0
Properties
Name Value
==== =====
awt.toolkit sun.awt.windows.WToolkit
class.load.environment oracle.ide.boot.IdeClassLoadEnvironment
class.load.log.level CONFIG
class.transfer delegate
eclipselink.xml.platform org.eclipse.persistence.platform.xml.jaxp.JAXPPlatform
file.encoding Cp1251
file.encoding.pkg sun.io
file.separator \
ice.browser.forcegc false
ice.pilots.html4.ignoreNonGenericFonts true
ice.pilots.html4.tileOptThreshold 0
ide.AssertCheckingDisabled true
ide.AssertTracingDisabled true
ide.bootstrap.start 14794412430839
ide.build ODI_11.1.1.6.0_GENERIC_111219.1055
ide.conf C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi\client\odi\bin\odi.conf
ide.config_pathname C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi\client\odi\bin\odi.conf
ide.debugbuild false
ide.devbuild false
ide.extension.search.path jdev/extensions
ide.firstrun true
ide.java.minversion 1.6.0_04
ide.launcherProcessId 2932
ide.main.class oracle.ide.boot.IdeLauncher
ide.patches.dir ide/lib/patches
ide.pref.dir C:\Users\Администратор\AppData\Roaming\odi
ide.pref.dir.base C:\Users\Администратор\AppData\Roaming
ide.product oracle.odi
ide.shell.enableFileTypeAssociation C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi\client\odi64.exe
ide.splash.screen splash.gif
ide.startingArg0 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi\client\odi64.exe
ide.startingcwd C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi\client
ide.user.dir C:\Users\Администратор\AppData\Roaming\odi
ide.user.dir.var IDE_USER_DIR
ide.work.dir C:\Users\Администратор\Documents\odi
ide.work.dir.base C:\Users\Администратор\Documents
ilog.propagatesPropertyEditors false
java.awt.graphicsenv sun.awt.Win32GraphicsEnvironment
java.awt.printerjob sun.awt.windows.WPrinterJob
java.class.path ..\..\ide\lib\ide-boot.jar;..\..\..\..\oracledi.sdk\lib\ojdl.jar;..\..\..\..\oracledi.sdk\lib\dms.jar;..\..\jdev\extensions\oracle.odi.navigator\lib\log4j-1.2.8.jar;..\..\jdev\extensions\oracle.odi.navigator\lib\odi_hfm.jar;..\..\jdev\extensions\oracle.odi.navigator\lib\odihapp_common.jar;..\..\jdev\extensions\oracle.odi.navigator\lib\ess_es_server.jar;..\..\jdev\extensions\oracle.odi.navigator\lib\ess_japi.jar;..\..\jdev\extensions\oracle.odi.navigator\lib\odihapp_essbase.jar;..\..\jdev\extensions\oracle.odi.navigator\lib\odihapp_planning.jar
java.class.version 50.0
java.endorsed.dirs C:\Java\jre\lib\endorsed
java.ext.dirs C:\Java\jre\lib\ext;C:\Windows\Sun\Java\lib\ext
java.home C:\Java\jre
java.io.tmpdir c:\Temp\
java.library.path C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi\client;.;C:\Windows\Sun\Java\bin;C:\Windows\system32;C:\Windows;C:\app\master\product\11.2.0\dbhome_1\bin;C:\Windows\system32;C:\Windows;C:\Windows\System32\Wbem;C:\Windows\System32\WindowsPowerShell\v1.0\
java.naming.factory.initial oracle.javatools.jndi.LocalInitialContextFactory
java.protocol.handler.pkgs null|oracle.odi.ui.url
java.runtime.name Java(TM) SE Runtime Environment
java.runtime.version 1.6.0_21-b07
java.specification.name Java Platform API Specification
java.specification.vendor Sun Microsystems Inc.
java.specification.version 1.6
java.util.logging.config.class oracle.core.ojdl.logging.LoggingConfiguration
java.vendor Sun Microsystems Inc.
java.vendor.url http://java.sun.com/
java.vendor.url.bug http://java.sun.com/cgi-bin/bugreport.cgi
java.version 1.6.0_21
java.vm.info mixed mode
java.vm.name Java HotSpot(TM) 64-Bit Server VM
java.vm.specification.name Java Virtual Machine Specification
java.vm.specification.vendor Sun Microsystems Inc.
java.vm.specification.version 1.0
java.vm.vendor Sun Microsystems Inc.
java.vm.version 17.0-b17
line.separator \r\n
LOG_FILE studio.log
native.canonicalization false
ODI_ORACLE_HOME C:\oracle\product\11.1.1\Oracle_ODI_1\
oracle.core.ojdl.logging.config.file ODI-logging-config.xml
oracle.home C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi\client
oracle.odi.studio.ess false
oracle.odi.wls.template.generator.agentEarLocation C:\oracle\product\11.1.1\Oracle_ODI_1\setup\manual\oracledi-agent\oraclediagent.ear
oracle.security.jps.config ./jps-config.xml
oracle.translated.locales de,es,fr,it,ja,ko,pt_BR,zh_CN,zh_TW
oracle.xdkjava.compatibility.version 9.0.4
org.apache.commons.logging.Log org.apache.commons.logging.impl.Jdk14Logger
os.arch amd64
os.name Windows Server 2008 R2
os.version 6.1
path.separator ;
python.cachedir c:\Temp\cachedir
python.home C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi\client\jdev\extensions\oracle.odi.navigator\scripting
python.packages.paths java.class.path,sun.boot.class.path,odi.class.path
reserved_filenames con,aux,prn,lpt1,lpt2,lpt3,lpt4,lpt5,lpt6,lpt7,lpt8,lpt9,com1,com2,com3,com4,com5,com6,com7,com8,com9,conin$,conout,conout$
sun.arch.data.model 64
sun.boot.class.path C:\Java\jre\lib\resources.jar;C:\Java\jre\lib\rt.jar;C:\Java\jre\lib\sunrsasign.jar;C:\Java\jre\lib\jsse.jar;C:\Java\jre\lib\jce.jar;C:\Java\jre\lib\charsets.jar;C:\Java\jre\classes;C:\Java\lib\tools.jar;C:\Java\lib\dt.jar
sun.boot.library.path C:\Java\jre\bin
sun.cpu.endian little
sun.cpu.isalist amd64
sun.desktop windows
sun.io.unicode.encoding UnicodeLittle
sun.java2d.noddraw true
sun.jnu.encoding Cp1251
sun.management.compiler HotSpot 64-Bit Server Compiler
sun.os.patch.level Service Pack 1
user.country RU
user.dir C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi\client\odi\bin
user.home C:\Users\Администратор
user.language ru
user.name master
user.timezone Europe/Moscow
user.variant
windows.shell.font.languages en
External Components
Name Version Path
==== ======= ====
HspJS.jar 11.1.1.1.0 Build 137 Built with CIS Drop 27 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi.sdk\lib\HspJS.jar
HspJS.jar 11.1.1.1.0 Build 137 Built with CIS Drop 27 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi\client\jdev\extensions\oracle.odi.navigator\lib\HspJS.jar
HspJS_11.1.2.0.jar 11.1.2.0.01.10 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi.sdk\lib\HspJS_11.1.2.0.jar
HspJS_11.1.2.0.jar 11.1.2.0.01.10 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi\client\jdev\extensions\oracle.odi.navigator\lib\HspJS_11.1.2.0.jar
activation.jar 1.1 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi.sdk\lib\activation.jar
binding-2.0.2.jar 2.0.2 2008-01-18 10:01:08 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi\client\jdev\extensions\oracle.odi.navigator\lib\binding-2.0.2.jar
coherence.jar 3.7.1.1 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi.sdk\lib\coherence.jar
commons-beanutils-1.7.0.jar 1.6 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi.sdk\lib\commons-beanutils-1.7.0.jar
commons-codec-1.3.jar 1.3 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi.sdk\lib\commons-codec-1.3.jar
commons-collections-3.2.jar 3.2 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi.sdk\lib\commons-collections-3.2.jar
commons-discovery-0.4.jar 0.4 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi.sdk\lib\commons-discovery-0.4.jar
commons-io-1.2.jar 1.2 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi.sdk\lib\commons-io-1.2.jar
commons-lang-2.2.jar 2.2 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi.sdk\lib\commons-lang-2.2.jar
commons-logging-1.1.1.jar 1.1.1 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi.sdk\lib\commons-logging-1.1.1.jar
commons-net-1.4.1.jar 1.4.1 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi.sdk\lib\commons-net-1.4.1.jar
commons-vfs-1.0.jar 1.0 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi.sdk\lib\commons-vfs-1.0.jar
cpld.jar Version 11.1.2.0.00, Build 589, March 25 2010 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi.sdk\lib\cpld.jar
cpld.jar Version 11.1.2.0.00, Build 589, March 25 2010 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi\client\jdev\extensions\oracle.odi.navigator\lib\cpld.jar
dbswing.jar 011.000.044.108 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi\client\jdev\extensions\oracle.odi.navigator\lib\dbswing.jar
dx.jar 007.001.175.112 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi\client\jdev\extensions\oracle.odi.navigator\lib\dx.jar
eclipselink.jar 2.3.1.v20111018-r10243 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi.sdk\lib\eclipselink.jar
enterprise_data_quality.jar 11.1.1.6.0 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi.sdk\lib\enterprise_data_quality.jar
enterprise_data_quality.jar 11.1.1.6.0 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi\client\jdev\extensions\oracle.odi.navigator\lib\enterprise_data_quality.jar
ess_es_server.jar 11.1.2.0.0.615 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi\client\jdev\extensions\oracle.odi.navigator\lib\ess_es_server.jar
ess_es_server.jar 11.1.2.0.0.615 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi.sdk\lib\ess_es_server.jar
ess_japi.jar 11.1.2.0.0.615 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi.sdk\lib\ess_japi.jar
ess_japi.jar 11.1.2.0.0.615 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi\client\jdev\extensions\oracle.odi.navigator\lib\ess_japi.jar
fmwgenerictoken.jar 1.0.0.0 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi\client\jdev\extensions\oracle.odi.navigator\..\..\..\..\..\odi_misc\fmwgenerictoken.jar
forms-1.2.0.jar 1.2.0 2008-02-23 08:37:50 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi\client\jdev\extensions\oracle.odi.navigator\lib\forms-1.2.0.jar
groovy-all-1.7.4.jar 1.7.4 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi\client\jdev\extensions\oracle.odi.navigator\scripting\groovy-all-1.7.4.jar
hsqldb.jar 2.0.0 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi.sdk\lib\hsqldb.jar
http_client.jar December 9 2011 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi.sdk\lib\http_client.jar
javax.security.jacc_1.0.0.0_1-1.jar 1.1 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi.sdk\lib\javax.security.jacc_1.0.0.0_1-1.jar
mail.jar 1.4 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi.sdk\lib\mail.jar
odi-sap.jar 10.1.3.12 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi.sdk\lib\odi-sap.jar
odi_hfm.jar 11.1.1.6.0 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi.sdk\lib\odi_hfm.jar
odi_hfm.jar 11.1.1.6.0 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi\client\jdev\extensions\oracle.odi.navigator\lib\odi_hfm.jar
odihapp_essbase.jar 11.1.1.6.0 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi\client\jdev\extensions\oracle.odi.navigator\lib\odihapp_essbase.jar
odihapp_essbase.jar 11.1.1.6.0 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi.sdk\lib\odihapp_essbase.jar
odihapp_planning.jar 11.1.1.6.0 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi.sdk\lib\odihapp_planning.jar
odihapp_planning.jar 11.1.1.6.0 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi\client\jdev\extensions\oracle.odi.navigator\lib\odihapp_planning.jar
odihapp_planning_11.1.2.0.jar 11.1.1.6.0 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi.sdk\lib\odihapp_planning_11.1.2.0.jar
odihapp_planning_11.1.2.0.jar 11.1.1.6.0 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi\client\jdev\extensions\oracle.odi.navigator\lib\odihapp_planning_11.1.2.0.jar
ojdbc6dms.jar 11.2.0.3.0 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi\client\jdev\extensions\oracle.odi.navigator\lib\ojdbc6dms.jar
oracle.ucp_11.1.0.jar 11.2.0.3.0 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi.sdk\lib\oracle.ucp_11.1.0.jar
pop3.jar 1.1.1 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi.sdk\lib\pop3.jar
spring-aop.jar 2.0.3 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi.sdk\lib\spring-aop.jar
spring-beans.jar 2.0.3 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi.sdk\lib\spring-beans.jar
spring-context.jar 2.0.3 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi.sdk\lib\spring-context.jar
spring-core.jar 2.0.3 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi.sdk\lib\spring-core.jar
spring-dao.jar 2.0.3 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi.sdk\lib\spring-dao.jar
spring-jdbc.jar 2.0.3 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi.sdk\lib\spring-jdbc.jar
spring-jmx.jar 2.0.3 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi.sdk\lib\spring-jmx.jar
spring-jpa.jar 2.0.3 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi.sdk\lib\spring-jpa.jar
spring-web.jar 2.0.3 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi.sdk\lib\spring-web.jar
trove.jar 2.1.0 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi.sdk\lib\trove.jar
wlthint3client.jar 10.3.5.0 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi.sdk\lib\wlthint3client.jar
woodstox.jar 3.2.1 C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi\client\jdev\extensions\oracle.odi.navigator\lib\woodstox.jar
Extensions
Name Identifier Version Status
==== ========== ======= ======
BM metamodel framework oracle.bm.meta 11.1.1.6.38.61.92 Loaded
Code Editor oracle.ide.ceditor 11.1.1.6.38.61.92 Loaded
Diagram Framework oracle.diagram 11.1.1.6.38.61.92 Loaded
Diagram Javadoc Extension oracle.diagram.javadoc 11.1.1.6.38.61.92 Loaded
Diagram Thumbnail oracle.diagram.thumbnail 11.1.1.6.38.61.92 Loaded
Extended IDE Platform oracle.javacore 11.1.1.6.38.61.92 Loaded
Groovy Support oracle.ide.groovy 11.1.1.4.37.58.91 Loaded
Help System oracle.ide.help 11.1.1.6.38.61.92 Loaded
Import/Export Support oracle.ide.importexport 11.1.1.6.38.61.92 Loaded
Index Migrator support oracle.ideimpl.indexing-migrator 11.1.1.6.38.61.92 Loaded
JViews Registration Addin oracle.diagram.registration 11.1.1.6.38.61.92 Loaded
Log Window oracle.ide.log 11.1.1.6.38.61.92 Loaded
Modeler Framework oracle.modeler 11.1.1.6.38.61.92 Loaded
Modeler Framework Common Layer oracle.modeler.common 11.1.1.6.38.61.92 Loaded
Navigator oracle.ide.navigator 11.1.1.6.38.61.92 Loaded
ODI Navigator oracle.odi.navigator 11.1.1.6.0.0.0 Loaded
Object Gallery oracle.ide.gallery 11.1.1.6.38.61.92 Loaded
Oracle Data Integrator oracle.odi 11.1.1.6.0.0.0 Loaded
Oracle IDE oracle.ide 11.1.1.6.38.61.92 Loaded
Peek oracle.ide.peek 11.1.1.6.38.61.92 Loaded
Persistent Storage oracle.ide.persistence 11.1.1.6.38.61.92 Loaded
Property Inspector oracle.ide.inspector 11.1.1.6.38.61.92 Loaded
Runner oracle.ide.runner 11.1.1.6.38.61.92 Loaded
Virtual File System oracle.ide.vfs 11.1.1.6.38.61.92 Loaded
Web Browser and Proxy oracle.ide.webbrowser 11.1.1.6.38.61.92 Loaded
audit oracle.ide.audit 11.1.1.6.38.61.92 Loaded
jdukshare oracle.bm.jdukshare 11.1.1.6.38.61.92 Loaded
mof-xmi oracle.mof.xmi 11.1.1.6.38.61.92 Loaded
oracle.ide.indexing oracle.ide.indexing 11.1.1.6.38.61.92 Loaded
palette2 oracle.ide.palette2 11.1.1.6.38.61.92 Loaded
You might want to look at the documentation related to Microsoft Excel: Microsoft Excel - 11g Release 1 (11.1.1)