Manage flat files before/after loading them
Hi all,
I am a beginner with OWB. I have to migrate a DW environment, in which all scripts are implemented manually (PL/SQL) and run on a UNIX OS, to OWB. Currently the data is loaded from flat files into temporary (staging) tables (first phase). An external process leaves more than one file per entity in a folder, and then a shell script joins (cat) all the files into one. For example:
customers1.txt
customers2.txt
customers3.txt
result: customers.txt
The same process calls SQL*Loader to load the file into the temp table. After the file is loaded, it is moved to a backup folder.
I have created a mapping that loads the data from a flat file into a staging table. My question is whether it is possible to execute a shell script from OWB in order to cat the files, and to move the resulting file into a backup folder when the loading process is finished.
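For what it's worth, the pre- and post-load file handling itself stays a few lines of shell. Below is a minimal runnable sketch of that flow; the directory locations use mktemp stand-ins so it can be run anywhere, and the sqlldr invocation (control file name, credentials) is an assumption, not your real setup:

```shell
#!/bin/sh
# Sketch of the pre/post-load handling described above.
# SRC_DIR/BAK_DIR are stand-ins; in the real flow they would be the
# landing folder and the backup folder.
SRC_DIR=$(mktemp -d)
BAK_DIR=$(mktemp -d)

# Demo input: the three files from the example.
printf 'cust1\n' > "$SRC_DIR/customers1.txt"
printf 'cust2\n' > "$SRC_DIR/customers2.txt"
printf 'cust3\n' > "$SRC_DIR/customers3.txt"

# 1. cat all files of the entity into a single load file.
cat "$SRC_DIR"/customers[0-9]*.txt > "$SRC_DIR/customers.txt"

# 2. Load the combined file (illustrative only; needs a real control
#    file and credentials, so it is commented out here).
# sqlldr userid=stg/stg control=customers.ctl data="$SRC_DIR/customers.txt"

# 3. After a successful load, move everything to the backup folder.
mv "$SRC_DIR"/customers*.txt "$BAK_DIR"/
```

As far as I know, OWB process flows include an external-process activity that can run a script like this before/after the mapping, which may be simpler than an Expert for this case.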
Many thanks for your inputs!
Castella V.
I think I can do it with Experts, using a Java_Task. The problem is that I cannot find any example of a Java_Task implementation. Could someone provide me with an example?
Thanks in advance!
Similar Messages
-
Hi All,
We have been using OWB version 10.2.0.3 for the past 6 months; until now we have been loading database to database.
Database version: 10.2
OWB version: 10.2.0.3
Server operating system: Sun Solaris 10
Client operating system: Windows XP
I would like to share some information with you all.
I have installed OWB 10gR2 on a personal PC running Windows XP, and I have one simple mapping for a flat file.
Steps I followed to load flat-file data into the database:
I created a folder on the C: drive, then created a directory object and granted privileges to access that folder.
I created the external table for the flat file manually.
After that I imported the external table and mapped it to the target table. It works fine and the data loads.
I am still confused about flat-file handling on the Sun Solaris operating system:
How do we handle flat files in a UNIX environment?
How do we create the directory and grant access privileges?
How do we create an external table with OWB?
Could you please help me?
Regards.
venkat.
Hi,
I logged in as SYS, created a directory, and granted privileges on 'c:\test_file'; test_file is a folder I created on my Windows PC. The server is Sun Solaris.
In OWB I created a table based on that delimited file, with the column names and values below.
no   name  sal    comm
100  aa    10000  1000
200  bb    20000  2000
300  cc    30000  3000
400  dd    40000  4000
After that I created the external table under the source module and deployed it to the database. The table was created successfully, but a SELECT statement shows an error:
ex: select * from <table> (error)
ORA-29913: error in executing ODCIEXTTABLEOPEN callout
ORA-29400: data cartridge error
KUP-04040: file abc.csv in TEST_V_S_LOC not found
ORA-06512: at "SYS.ORACLE_LOADER", line 19
The privileges I granted are as follows:
create or replace directory sample_owb as 'C:\test_file';
grant read, write on directory sample_owb to ods_inb_stg;
What is the cause, and how do I handle and resolve it? Could you please help me?
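One thing worth checking: KUP-04040 usually means the file is not visible to the *database server* at the directory path. 'C:\test_file' is a folder on the Windows client, but the database runs on Sun Solaris, so the directory object must point at a folder on the Solaris server that actually holds abc.csv. A sketch of the server-side setup follows; the Solaris path is an assumption, use whatever server folder holds the file:

```shell
#!/bin/sh
# Writes a sketch of the server-side setup to a SQL script; the path
# '/export/home/oracle/test_file' is an assumed Solaris folder.
cat > fix_directory.sql <<'EOF'
-- Run as a privileged user, after copying abc.csv to the server folder.
CREATE OR REPLACE DIRECTORY sample_owb AS '/export/home/oracle/test_file';
GRANT READ, WRITE ON DIRECTORY sample_owb TO ods_inb_stg;
EOF

# sqlplus / as sysdba @fix_directory.sql   # illustrative invocation
```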
regards,
venkat. -
Track flat files that failed loading in sql loader
Hi,
Can anyone please suggest a way to track the flat files that failed while loading with SQL*Loader, and to pass the failed flat file's name to a column of a database table?
Thanks in advance.
Edited by: 806821 on Nov 2, 2010 10:22 AM
Hi Morgan, thanks for your reply.
Define failed: 1 row not loaded? No rows loaded? What operating system? What version of the Oracle database? Track in a table, or send an email?
Your inquiry is long on generalities and short on specifics: fill in all the blanks, not just the ones I've listed above.
Even if only 1 row is not loaded, it should be considered a failed load, and the name of that particular flat file should be fetched.
The operating system is UNIX.
The Oracle database we are using is R12.
Track in a table: yes, and we want to send an email notification whenever a flat file fails to load.
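A minimal sketch of one way to do this from the shell script that drives SQL*Loader: on UNIX, sqlldr exits 0 on success, 2 (EX_WARN) when any rows were rejected, and 1 or 3 on errors, so "even 1 row not loaded" maps to a non-zero exit status. The tracking-table/mail details are assumptions, and the LOADER variable is a stand-in so the sketch is runnable without a database:

```shell
#!/bin/sh
# Record the data file name whenever the load did not fully succeed.
track_load() {
    datafile=$1
    # The real call would be something like:
    #   sqlldr userid=apps/pwd control=load.ctl data="$datafile"
    ${LOADER:-false} "$datafile"      # stand-in command for the sketch
    status=$?
    if [ "$status" -ne 0 ]; then
        # Real flow: INSERT the name into a tracking table and notify, e.g.
        #   echo "insert into failed_loads(fname) values ('$datafile');" | sqlplus -s ...
        #   mailx -s "Load failed: $datafile" dba@example.com < /dev/null
        echo "$datafile" >> failed_files.log
    fi
    return "$status"
}
```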
Thanks once again! -
How can I view my 6D raw files before I open them in post-processing?
How can I view my 6D raw files after I load them on my computer and before I open them in a post-processing program?
I have no problem doing that with my 1D X, 1D Mark IV, or 5D Mark III. The problem exists with any SDHC card I use in the 6D, but not in any other camera with the same cards.
Solved!
Go to Solution.
Your computer doesn't have camera RAW support installed for the 6D. RAW isn't really a standard format; it's more of a concept. Every individual camera has its own unique RAW format (even though they all share the same .CR2 extension).
Each time a new camera comes out, companies that make RAW processing software will make an update available that adds that camera. You'll need to get that update and install it.
Tim Campbell
5D II, 5D III, 60Da -
Find out duplicate rows in a flat file before using sqlldr
Hello, I want to import data via SQL*Loader from a flat file into a table in my database. Unfortunately, my flat file contains some duplicates. I defined my load table with a two-column primary key (date and time), and sometimes there is more than one row with the same date and time in the flat file. These primary keys are important to me because I want to use them for later tables, and I can't use the direct path and parallel method while enforcing primary keys.
Is there any tool that can find duplicates before I run SQL*Loader? The special case here is that the rows are not fully duplicated; only the date and time values appear twice. For my purposes it doesn't matter whether the second row with the same date and time has different values: the file contains data that is monitored every second, and that's enough.
It would be nice if someone could help me.
Cheers
I simply load from SQL*Loader into staging tables first.
The staging tables allow duplicates then I do what I need to do in regards to duplicates or missing data during the transfer from the staging tables to the real tables.
The staging tables are also global temporary tables so I don't have to worry about cleaning them up or multiple sessions trampling each other.
I have also used an external table on the data file instead of SQL*Loader; this lets me get rid of the staging table, but it is only good for very small datasets in the file being loaded. -
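For completeness, a pre-load dedup on just the key columns is a one-liner with awk, keeping the first row for each date+time pair (which matches the case above, where later rows with the same key are irrelevant). The sketch assumes a space-delimited file with the date in field 1 and the time in field 2; adjust the field separator and field numbers for the real layout:

```shell
#!/bin/sh
# Demo input with one duplicated date+time key (fields 1 and 2).
printf '%s\n' \
    '20100101 120001 42' \
    '20100101 120001 43' \
    '20100101 120002 44' > sample.dat

# Keep only the first row per (date, time) pair.
awk '!seen[$1 FS $2]++' sample.dat > deduped.dat

cat deduped.dat
```

The deduped file can then be fed to sqlldr (or an external table) without tripping the primary key.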
Flat File to InfoCube Load Issue
Hello All,
I am a newbie in the SAP BW world. This is probably a simple question for the BW experts.
This is a crude implementation of the Fu and Fu example.
Environment: SAP BW 7.0
I am trying to load a flat file into the InfoCube.
1. I am able to successfully preview the data from the DataSource.
2. I am also able to load the data into the PSA, which I can see from the PSA maintenance screen.
The issue arises when I try to load the data into the InfoCube by running the DTP process; it just does not load the data.
I am unable to figure out what the issue is.
Following is the structure of the flat file, with one line of data:
Day | Unit of Measure | Customer ID | Material Number | Sales Representative ID | Sales Office | Sales Region | Price of Material | Sales quantity | Sales revenue | Currency
20100510 | gms | 1122334455 | 92138 | 9710274 | Chicago | NorthEast | 890 | 50 | 928 | USD
Following is the structure of the Info Cube
Characteristics
Time
Unit
Customer ID
Material Number
Sales Representative ID
Sales Office
Sales Region
Measures
Price of Material
Sales quantity
Sales revenue
--- Also, where can I find a data flow diagram or the steps of what needs to be done to design a standard cube-loading process?
Please tell me where I am going wrong.
I think I made progress, but I still do not see the data in the InfoCube.
When I go to the PSA via the source system -> Manage PSA -> select the line item -> PSA Maintenance, I see the complete record, like this:
1 1 05/10/2010 GMS 1,122,334,455 92,138 9,710,274 CHICAGO NORTHEAST 890 50 928
But when I run the DTP process I see the following error message. I think this is due to a data type mismatch between the flat file and the InfoCube. Please advise. The error I see is:
No SID found for value 'GMS ' of characteristic 0UNIT (BRAIN 70)
Also, where can I extract the InfoCube structure in plain text so that I can post it on the forum for better visibility? -
Change InfoObject value in flat file for Hierarchy load
Hi Guys,
I am loading a hierarchy for an InfoObject from a flat file. I have the field InfoObject (IOBJNM) whose value needs to be changed before it is loaded into the hierarchy. Can somebody tell me how I can do that through a formula or routine?
The InfoObject loads 0HIER_NODE and leaf InfoObject values from the flat file.
The hierarchy transfer rules do not contain the IOBJNM field, though the hierarchy structure under the Transfer Structure tab does contain IOBJNM.
In the transfer structure I see only NodeName, Date From, Date To, and the leaf InfoObject.
Any help would be appreciated.
Hi,
Try using the IDoc method while loading the hierarchy, and make sure you have IOBJNM in the external characteristics of the InfoObject into which you are loading this hierarchy.
Thanks,
Peter -
Output in flat file master data loading
Hi gurus,
I am loading master data into an InfoObject from a flat file. In the output headers (the InfoObject names) I see only long descriptions, not the technical names I gave. How can I see the technical names of the InfoObjects in the output?
Hi Ram,
I am not getting your question clearly. What I understand is: you can see the descriptions in the preview on the External Data tab, but you do not get the corresponding values after loading. (I went through the same problem.)
Solution: check the language field; set it to English (E) and try to load the package again.
Note: please check the preview and the simulated preview as well.
Let me know if the problem persists.
Regards
Vinay -
Not loading from flat file using SQL*Loader
Hi,
I am trying to load from an Excel file.
First I converted the Excel file into a CSV file and saved it as a .dat file.
In the Excel file one column is salary, and the data looks like $100,000.
While converting from .xls to .csv, the salary is changed to "$100,000 " (with quotes and a space after the amount); a space is put after the last digit.
In the SQL*Loader control file I had given:
salary "to_number('L999,999')"
My problem is the space after the salary in the .dat file: "$100,000 "
What changes do I have to make to the to_number function in the control file?
Please guide me.
Thanks & Regards
Salih KM
Message was edited by:
kmsalih
Thanks a lot, Jens Petersen! It is loading now.
MI means minute, am I correct? But I didn't get the logic behind that; can you please explain it?
Thanks & Regards
Salih KM -
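For reference, the usual fix for the trailing space is to TRIM the field inside the SQL string before TO_NUMBER. A sketch of the corrected control file follows; the table name, delimiter settings, and data file name are assumptions for illustration, and the format mask is the one from the post:

```shell
#!/bin/sh
# Writes a sketch of the corrected control file; emp_stage, salary.dat,
# and the delimiter settings are assumed names.
cat > salary.ctl <<'EOF'
LOAD DATA
INFILE 'salary.dat'
INTO TABLE emp_stage
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
( salary "to_number(trim(:salary), 'L999,999')" )
EOF

# sqlldr userid=scott/tiger control=salary.ctl   # illustrative invocation
```

TRIM removes the stray space that the Excel-to-CSV conversion appended, so TO_NUMBER only sees the currency string it expects.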
Flat File: no data load into Info Cube
Hi there,
I am trying to load a flat file. When I simulate the upload it works well, but no data is loaded into my InfoCube, and when I try to define a query none is available.
Can someone provide me with a solution for this problem?
With rgds
Oktay Demir
Hi Oktay,
In addition to A.H.P.'s remarks, check that:
- the data is posted not only into the PSA but also into the data target,
- the update rules are active,
- the monitor status in the cube administration is OK,
- the request is available for reporting within the cube administration.
Cheers
Sven -
How many errors in data file before data load is aborted?
Is there a limit to how many incorrect records you can have when loading a free-form file before Essbase aborts the load process?
I am trying to export a complete database to another database where the cost-center-level detail is removed, i.e. all parents exist (and data has been exported using all levels).
Thanks
I have done it this way in the past:
1-Create a dummy scenario.
2-Write a calc script to copy data to that scenario, but only fix on the blocks you want to extract.
3-Delete all other scenarios, thus leaving ONLY the data you copied to your dummy scenario.
4-Rename the dummy scenario to the scenario name you need.
5-Export the full database to a file.
6-Load the export into your destination database.
This might work, but since I don't know your exact source and target hierarchies, it may not. -
Flat file transactional data load error
I am trying to load a flat file into a BPC model. I am using the DM package "Import transactional data load from flat file" to load the data. The transformation file and data file validate correctly.
However, when I run the DM package, I get the message below:
Task Name: Load
Cannot perform Read.
Model : Package Status : Error.
We have two secure dimensions in the model. I have tried different combinations, even with the primary admin role, and I still get the same error message.
Is this a security-related error? The model was transported from DEV; in DEV I do not see any errors.
Any advice/help?
Hi King,
I think that in the back end you need to check the real-time load behavior option: is the cube in planning mode or loading mode? If possible, share a screenshot of your error message.
Go to RSA1, select your cube, right-click, choose "Change real-time load behavior", and change it to planning mode.
Regards,
Saida Reddy Gogireddy -
IDVD Crashes After Loading Themes Bar
I open up iDVD and create a new project and save it; then the loading-themes bar shows up, disappears, and nothing else happens. I have to force quit in order to get out of the program.
I tried the solution that is already on the site (deleting themes 1-4) and it still doesn't work.
I have a BlackBook 2.2 SR which I purchased in November. iDVD worked when I first got it, but this problem started around the beginning of January. I have no idea what to do, since this is my first Mac.
Any help would be appreciated.
Thanks.
Try this basic troubleshooting fix:
1 - Delete the iDVD preference file, com.apple.iDVD.plist, which resides in your User/Home/Library/Preferences folder.
2 - Delete iDVD's cache file, Cache.db, which is located in your User/Home/Library/Caches/com.apple.iDVD folder.
3 - Launch iDVD and try again.
NOTE: In Lion and Mountain Lion the Home/Library folder is now invisible. To make it permanently visible enter the following in the Terminal application window: chflags nohidden ~/Library and hit the Enter button - 10.7: Un-hide the User Library folder.
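The two delete steps can also be done directly from Terminal (with iDVD closed), using the same paths named in the steps; -f/-rf make the commands safe to re-run if the files are already gone:

```shell
#!/bin/sh
# Delete iDVD's preference file and cache folder (run with iDVD closed).
rm -f  "$HOME/Library/Preferences/com.apple.iDVD.plist"
rm -rf "$HOME/Library/Caches/com.apple.iDVD"
```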
OT -
Problem in Flat file transaction data loading to BI7
Hi Experts,
I am new to BI 7 and have a problem activating the DataSource for transaction data and creating a transfer routine to calculate sales revenue, which is not in the flat file; the flat file only has price per unit and quantity sold.
Flat file fields are:
Cust id SrepID Mat ID Price per unit Unit measure Quantity TR. Date
cu100 dr01 mat01 50 CS 5 19991001
The InfoObjects created are:
IO_CUID
IO_SRID
IO_MATID
IO_PRC
IO_QTY
IO_REV
0CALDAY
When I created IO_QTY, the unit/currency was given as 0UNIT. In the creation of the DataSource, under the Fields tab, an extra field CS was shown, and I was confused about which object to map it to. At the same time, when I enter IO_QTY it prompts a dialog box with 0UNIT, so I said OK. Now I can't map CS to a unit, as the system automatically created 0UNIT when I entered IO_QTY. Could you please give me a solution for this, and the procedure for entering a formula to calculate sales revenue at this level instead of at query level? Your solutions will be much appreciated with points.
Awaiting your solutions
Thanks
Shrinu
Hi Sunil,
Thanks for your quick response. I tried to assign the InfoObjects under the Fields tab while creating the DataSource. I have not reached the data target yet.
Could you please answer my whole question?
Will you be able to send some screenshots to my email id [email protected]
Thanks
Shri -
RMAN in 10g database deletes archivelog files before standby receives them
Hi all,
We currently have a problem with our Oracle 10gR1 database on Windows 2000 Server: the RMAN backups on the primary database delete archived logs before the standby database can receive them. RMAN backs up the archived logs and deletes them from disk before they are sent to the standby. The standby then looks for these archived logs on the primary but is unable to locate them. Whenever this happens, the production (primary) database hangs and we have to restart the instance. We have our Data Guard setup in maximum performance mode, so this should not happen.
As a short-term fix, we have changed the backups so that RMAN backs up and deletes only archived logs more than 15 minutes older than the current time, so that the standby can receive the logs without problems. Besides this fix, is there a long-term solution to the problem, or is it a bug in Oracle 10g? The issue came up after we upgraded from 9i to 10g; we never saw it in 9i with RMAN and Data Guard physical standby databases.
Thanks
Ben Prusinski, Oracle DBA
Thanks for the answer...
Did you manage to make it work?
According to the documentation, if it is set on the standby it should:
<<
Then, these files are eligible for deletion:
Archived redo log files in the flash recovery area that were applied on the standby database.
>>
And to my understanding this is irrespective of the retention policy... unfortunately, REPORT OBSOLETE does not report applied archived logs as eligible for deletion.
How does this work in your case?
regards.
goran
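For anyone hitting this later: the usual long-term fix (available from 10.2 onward, so after an upgrade from 10gR1) is to configure an RMAN archived-log deletion policy on the primary so logs are never deleted until they have been applied on the standby. A sketch of the command, written to an RMAN command file:

```shell
#!/bin/sh
# Writes the RMAN deletion-policy command to a command file; run it
# against the primary once the database is on 10.2 or later.
cat > deletion_policy.rcv <<'EOF'
CONFIGURE ARCHIVELOG DELETION POLICY TO APPLIED ON STANDBY;
EOF

# rman target / cmdfile=deletion_policy.rcv   # illustrative invocation
```

With this policy in place, "BACKUP ... DELETE INPUT" skips logs the standby still needs, so the 15-minute workaround is no longer required.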