Master data Load from Oracle table
Hi,
My master data comes from an Oracle table. The table has both attributes and texts combined. How do I load the master data into the text and attribute InfoObjects through direct upload of master data?
Is it necessary to go in for Flexible Upload in this case?
Regards
Hi,
You can create two views on the table: one for the texts and one for the attributes. Create your DataSources on these views and assign them to your InfoObject.
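A minimal sketch of the two-view idea, using sqlite3 only so it runs anywhere (table and column names are invented; on Oracle these would be ordinary CREATE VIEW statements):

```python
import sqlite3

# Hypothetical combined master table: a key, one text column, two attributes.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE master_tab (
    matnr  TEXT PRIMARY KEY,  -- key (e.g. material number)
    descr  TEXT,              -- text
    color  TEXT,              -- attribute
    weight REAL               -- attribute
)""")
conn.execute("INSERT INTO master_tab VALUES ('1000', 'Steel bolt', 'grey', 0.05)")

# One view feeds the text DataSource, the other the attribute DataSource.
conn.execute("CREATE VIEW v_text AS SELECT matnr, descr FROM master_tab")
conn.execute("CREATE VIEW v_attr AS SELECT matnr, color, weight FROM master_tab")

print(conn.execute("SELECT * FROM v_text").fetchall())  # [('1000', 'Steel bolt')]
print(conn.execute("SELECT * FROM v_attr").fetchall())  # [('1000', 'grey', 0.05)]
```

Each view then looks like a plain table to the extraction layer, so no flexible upload is strictly required for this split.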
regards
Siggi
Similar Messages
-
BPC:: Master data load from BI Process chain
Hi,
We are trying to automate the master data load from BI.
Now we are using a package with:
PROMPT(INFILES,,"Import file:",)
PROMPT(TRANSFORMATION,%TRANSFORMATION%,"Transformation file:",,,Import.xls)
PROMPT(DIMENSIONNAME,%DIMNAME%,"Dimension name:",,,%DIMS%)
PROMPT(RADIOBUTTON,%WRITEMODE%,"Write Mode",2,{"Overwrite","Update"},{"1","2"})
INFO(%TEMPNO1%,%INCREASENO%)
INFO(%TEMPNO2%,%INCREASENO%)
TASK(/CPMB/MASTER_CONVERT,OUTPUTNO,%TEMPNO1%)
TASK(/CPMB/MASTER_CONVERT,FORMULA_FILE_NO,%TEMPNO2%)
TASK(/CPMB/MASTER_CONVERT,TRANSFORMATIONFILEPATH,%TRANSFORMATION%)
TASK(/CPMB/MASTER_CONVERT,SUSER,%USER%)
TASK(/CPMB/MASTER_CONVERT,SAPPSET,%APPSET%)
TASK(/CPMB/MASTER_CONVERT,SAPP,%APP%)
TASK(/CPMB/MASTER_CONVERT,FILE,%FILE%)
TASK(/CPMB/MASTER_CONVERT,DIMNAME,%DIMNAME%)
TASK(/CPMB/MASTER_LOAD,INPUTNO,%TEMPNO1%)
TASK(/CPMB/MASTER_LOAD,FORMULA_FILE_NO,%TEMPNO2%)
TASK(/CPMB/MASTER_LOAD,DIMNAME,%DIMNAME%)
TASK(/CPMB/MASTER_LOAD,WRITEMODE,%WRITEMODE%)
But we need to include these tasks into a BI process chain.
How can we add the INFO statement into a process chain?
And how can we declare the variables?
Regards,
EZ.
Hi,
I have followed your recommendation, but when I try to use the process /CPMB/MASTER_CONVERT with the parameter TRANSFORMATIONFILEPATH and the path of the transformation file as its value, I have a new problem: the value is limited to 60 characters, and my path is longer:
\ROOT\WEBFOLDERS\APPXX\PLANNING\DATAMANAGER\TRANSFORMATIONFILES\trans.xls
How can we pass this path?
Regards,
EZ. -
Mainframe data loaded into Oracle tables - Test for low values using PL/SQL
Mainframe legacy data has been copied straight from the legacy tables into mirrored tables in Oracle. Some columns in the mainframe data contained 'low values', and those columns were defined on the Oracle tables as VARCHAR2. Looking at the data, some of these columns appear to contain little square boxes; presumably that is how Oracle renders the mainframe 'low values' in a VARCHAR2. When I run a select for rows where this column is not null, these rows are returned. In the select results the columns appear blank, but in SQL Developer I can see the odd square boxes, so the select is evidently detecting that something exists in the column.
Long story short: I need to test this legacy data in the Oracle tables using PL/SQL to detect 'low values'. Does anyone have suggestions on how I could do this? The mainframe data we are loading into these tables arrives with low values in some columns.
I am using Oracle 11i.
Thanks
Edited by: ncsthbell on Nov 2, 2009 8:38 AM
ncsthbell wrote:
Mainframe legacy data has been copied straight from the legacy tables into mirrored tables in Oracle.
Not a wise thing to do. Mainframe operating systems typically use EBCDIC, while Unix and Windows servers use ASCII. The endianness is also different (big endian vs. little endian).
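For a quick sanity check outside the database, a sketch like this (pure Python, all names invented) shows what 'low values' are at the byte level and mimics the kind of hex view DUMP() gives:

```python
# "Low values" from a straight mainframe copy are binary zeros (and sometimes
# other control bytes) sitting inside what Oracle treats as character data.
def has_low_values(s):
    """True if the string contains NUL or other non-printable control bytes."""
    return any(ord(ch) < 0x20 and ch not in '\t\n\r' for ch in s)

def dump_hex(s):
    """Rough analogue of Oracle's DUMP(): show each character as hex."""
    return ' '.join(f'{ord(ch):02x}' for ch in s)

print(has_low_values('ABC\x00\x00'))  # True  - trailing NULs, the 'square boxes'
print(has_low_values('ABC  '))        # False - plain spaces are fine
print(dump_hex('A\x00'))              # 41 00
```

In PL/SQL the equivalent test would be a predicate such as `INSTR(col, CHR(0)) > 0`, with DUMP(col) used to inspect suspect rows.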
Does anyone have any suggestions on how I could do this?
As suggested, use the SQL function DUMP() to see the actual contents (in hex) of these columns. -
Data Load to Oracle Tables.
Dear User,
I have a requirement where I need to create a new Apex application that will allow the Planning department to create item lists and location lists. They will build the lists based on values (location or item) that they paste from their spreadsheets into the Apex app.
Location list will be in the below Format.
"02-05-07-14-15-19-20-21-23-25-27-28-44-47-53-54-61-63-65-66-68-69-74-75-77-79-81-85-86-87-89-90-92-95-96-97-98-99-101-103-105-107-111-112-116-119-120-127-130-133-139-143-147"
And Item List will be in the below Format.
7310468
1009521
4490033
4490156
4490318
4490334
4490059
4490083
I need to load the location list data and item list data into two different tables, "Loc_List_stage" and "Item_list_stage". Every location should be on a separate line, and the same for the item list.
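The splitting itself can be sketched in a few lines of Python; the pasted strings here are truncated samples, not the real lists, and the parsing rules are an assumption about how the pasted text arrives:

```python
# Locations arrive as one dash-delimited string, items as one value per line.
pasted_locations = "02-05-07-14-15"            # truncated sample of the real string
pasted_items = "7310468\n1009521\n4490033"     # truncated sample

# One staging row per value, ready for insert into Loc_List_stage / Item_list_stage.
loc_rows = [v for v in pasted_locations.split("-") if v]
item_rows = [v for v in pasted_items.splitlines() if v.strip()]

print(loc_rows)   # ['02', '05', '07', '14', '15']
print(item_rows)  # ['7310468', '1009521', '4490033']
```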
Any Help or suggestion will be appreciated.
Thanks!
NS
Hi,
Go to: SQL Workshop -> Utilities -> Data Workshop
In the Data Load section choose: Text Data
Load To : Existing table
Load From: Copy and paste
Next choose the schema in which your table is present in which you want to load data
and then select the table name.
Then copy and paste the data to be uploaded, and upload it.
See if it works!
Regards,
Kiran -
Need to load large data set from Oracle table onto desktop using ODBC
I don't have TOAD nor any other tool for querying the database. I'm wondering how I can load a large data set from an Oracle table onto my desktop using Excel or Access or some other tool using ODBC or not using ODBC if that's possible. I need results to be in a .csv file or something similar. Speed is what is important here. I'm looking to load more than 1 million but less than 10 million records at once. Thanks.
hillelhalevi wrote:
I don't have TOAD nor any other tool for querying the database. I'm wondering how I can load a large data set from an Oracle table onto my desktop using Excel or Access or some other tool using ODBC or not using ODBC if that's possible. I need results to be in a .csv file or something similar. Speed is what is important here. I'm looking to load more than 1 million but less than 10 million records at once. Thanks.
Use Oracle's free Sql Developer
http://www.oracle.com/technetwork/developer-tools/sql-developer/downloads/index.html
You can just issue a query like this
SELECT /*csv*/ * FROM SCOTT.EMP
Then just save the results to a file
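Outside SQL Developer, the same streamed export can be sketched with Python's stdlib; everything here (table, columns, file name) is only an example:

```python
import csv
import sqlite3

# Stream a query straight to a .csv file. Nothing is held in memory beyond the
# cursor's fetch buffer, so the same pattern scales to millions of rows.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE emp (empno INTEGER, ename TEXT)")
conn.executemany("INSERT INTO emp VALUES (?, ?)", [(7369, 'SMITH'), (7499, 'ALLEN')])

cur = conn.execute("SELECT empno, ename FROM emp")
with open("emp.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow([d[0] for d in cur.description])  # header row from cursor metadata
    writer.writerows(cur)                             # stream the data rows
```

With an Oracle driver in place of sqlite3, only the connect call changes; the cursor/csv part is identical.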
See this article by Jeff Smith for other options
http://www.thatjeffsmith.com/archive/2012/05/formatting-query-results-to-csv-in-oracle-sql-developer/ -
Data loading from one table to another
Hi,
I want to load some data from a temp table into a master table. The master table has 40 million records and the temp table has 23 million. The master table has around 50 columns, to which we are adding 4 new columns; the temp table has 5 columns. The data for these 4 new columns is available in the temp table, and the employee column is common to the two tables.
I used a stored procedure with a cursor to load the data, but it is taking more than 6 hours.
Can any one suggest me a good technique to load data faster?
Thanks,
Santhosh.
Hi, consider this scenario, which matches yours:
First of all, you have to update, not insert into, the master table.
master table = emp with columns (emp_id, emp_name, emp_designation)
to this original master table you added two more columns emp_salary, emp_department
so now your master table looks like emp_id, emp_name, emp_designation, emp_salary, emp_department
but when you do select * from master table, the last two columns salary & department are blank.
Now you have another temp table with folllowing columns (emp_id, emp_salary, emp_department)
Now emp_id is common to the master and temp tables, and you want to put values from the temp table into the master table? I think this is what you're trying to do.
so for the above case the query i would write is
update master_table m
   set m.emp_salary = (select t.emp_salary from temp_table t where t.emp_id = m.emp_id)
 where exists (select 1 from temp_table t where t.emp_id = m.emp_id);
commit;
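The same set-based idea, runnable end to end with sqlite3 (schema reduced to the salary column only; table names are invented):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE emp_master (emp_id INTEGER PRIMARY KEY, emp_name TEXT, emp_salary REAL)")
conn.execute("CREATE TABLE emp_temp (emp_id INTEGER PRIMARY KEY, emp_salary REAL)")
conn.execute("INSERT INTO emp_master (emp_id, emp_name) VALUES (1, 'A'), (2, 'B')")
conn.execute("INSERT INTO emp_temp VALUES (1, 5000)")

# One correlated UPDATE replaces the row-by-row cursor loop. The EXISTS clause
# keeps unmatched master rows from being overwritten with NULL.
conn.execute("""
    UPDATE emp_master SET emp_salary =
        (SELECT t.emp_salary FROM emp_temp t WHERE t.emp_id = emp_master.emp_id)
    WHERE EXISTS (SELECT 1 FROM emp_temp t WHERE t.emp_id = emp_master.emp_id)
""")
print(conn.execute("SELECT emp_id, emp_salary FROM emp_master ORDER BY emp_id").fetchall())
# [(1, 5000.0), (2, None)]
```

On Oracle at this scale, a single MERGE (or a parallel DML update) would be the usual production choice; the point is the same: one statement, no per-row cursor round trips.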
Regds. -
Master Data Loading from APO to BW System
Hi Experts,
I am loading master data from APO into the BW system. After loading the data into BW, I found that the number of data records does not match:
e.g. APO (8,340) ---> BW (9,445)
I also tried to delete the data in the InfoObject, but only some of the records are deleted.
And one more doubt:
in APO I have InfoObject ZA9MATNR CHAR 40 with conversion routine PRODU;
in BW I have the same InfoObject ZA9MATNR CHAR 40 with conversion routine ALPHA.
I have also checked the checkbox in the transfer rules; maybe because of this I am getting more records.
Could any body help me
Thanks in Advance...
Regards,
Nagamani.
Hi,
Ideally it is impossible to get an increased count of data records when you load into BW, unless you are using the RETURN TABLE concept.
Normally the record count decreases, as records with the same keys get overwritten/added.
Coming to your second question: the ALPHA conversion routine has nothing to do with the record count. It prepends zeros to the value coming into that InfoObject. For example, if the length of the object is 5 and the data comes in with only 3 digits, it will prepend zeros to make the length 5.
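A rough Python illustration of that behaviour (the real routine is ABAP inside BW; this is only an analogy of the padding rule):

```python
def alpha_convert(value, length):
    """ALPHA-style conversion: left-pad purely numeric values with zeros
    to the InfoObject's full length; leave non-numeric values untouched."""
    value = value.strip()
    return value.zfill(length) if value.isdigit() else value

print(alpha_convert('123', 5))  # 00123
print(alpha_convert('ABC', 5))  # ABC  (non-numeric values pass through)
```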
Hope this info helps you.
Regards,
Yogesh. -
Hi ,
This is a simple question.
I am planning to load data from Oracle. What connection info do I need to achieve this?
When I select 'Open SQL' from the EAS console, the Open SQL datasource dialog box contains server name, application name and database name. What should be entered for these?
Hi,
Please refer the below link to create an ODBC connection in Linux/Unix box.
http://publib.boulder.ibm.com/infocenter/c8bi/v8r4m0/index.jsp?topic=/com.ibm.swg.im.cognos.rdm_cr_updates.8.4.0.doc/rdm_cr_updates_id1183SettingUpODBCConnections.html
h4. Open SQL datasource dialog box contains server name, application name and database name, what should be entered for these?
Here you have to provide the server address or full server name of the machine where Oracle is installed; for the application name and database name, provide the database name and the table name you created in Oracle.
Hope this helps.
Greetings
SST..... -
Data transfer from Oracle table to Sql Server
Hi Friends,
How can we transfer the data of one table from Oracle to SQL Server and refresh it every morning at 10 a.m.?
Also see:
http://technet.oracle.com/docs/products/oracle8i/doc_library/817_doc/server.817/a76960/hs_genco.htm#173
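If a gateway or database link is not available, a plain extract-and-load batch job (run daily from a scheduler such as cron) is a common fallback. A runnable sketch, with both ends simulated by sqlite3 and every name invented:

```python
import sqlite3

# In practice src/dst would come from the two databases' own drivers
# (e.g. an Oracle driver and a SQL Server driver); sqlite3 keeps this runnable.
src = sqlite3.connect(":memory:")
dst = sqlite3.connect(":memory:")
src.execute("CREATE TABLE orders (id INTEGER, amount REAL)")
src.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 9.5), (2, 12.0)])
dst.execute("CREATE TABLE orders (id INTEGER, amount REAL)")

# Batched copy: fetch a chunk, bulk-insert it, repeat until the cursor is drained.
cur = src.execute("SELECT id, amount FROM orders")
while True:
    batch = cur.fetchmany(1000)
    if not batch:
        break
    dst.executemany("INSERT INTO orders VALUES (?, ?)", batch)
dst.commit()
print(dst.execute("SELECT COUNT(*) FROM orders").fetchone()[0])  # 2
```

For a daily refresh the job would typically truncate (or merge into) the target first; that step is omitted here.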
And from Tom Kyte:
...you could put the MS SQL Server JDBC driver into the Oracle database. Then a
Java stored procedure can open a connection to MS SQL Server. This connection would not
have two-phase commit abilities, distributed query or anything, but would let you
query data from there and update it as well. -
Master data load from 2 source system
Hi all,
I am working on a project which has 2 source systems.
Now I have to load master data from the 2 source systems without making 0LOGSYS a compounding
attribute, because all the objects on which I would use a compounding attribute are reference
objects for a lot of other InfoObjects, and I don't want to change the business content
objects' structure. Please guide me.
Thanks
Hi,
In the cube there is nothing to do with the compounding attribute; it is handled at the master data InfoObject level. As your requirement is to separate the transactions happening in the different source systems, add another InfoObject, e.g. ZSOURCE, make it a navigational attribute, and design your report based on that navigational attribute.
Note that this separation is only possible in Material Related transaction not for others.
Thanks
BVR -
Selective Deletion and Data Load from Setup Tables for LIS Extractor
Hi All,
We came across a situation where one of the deltas in the PSA was missed and not loaded into the DSO. This DSO updates another cube in the flow. Since many days have passed since this miss came to our knowledge, we need to selectively delete the data for those documents from the DSO and cube, and then take a full load for those documents after filling the setup table.
Now, what is the right approach to load this data from setup table > DSO > cube? There is a change log present for those documents, and a few KPIs in the DSO are in summation mode.
Regards
Jitendra
Thanks, Ajeet!
This is the Sales Order extractor. The data got loaded into the ODS just fine, but the data is coming to the ODS from a different extractor. Everything is fine in the ODS, but not in the cube. Would a full repair request versus a full load make a difference when the data is going to the cube? I thought it would matter only when loading into the ODS.
What do you mean by "Even if you do a full load without any selections you should do a full repair"?
thanks.
W -
Load from Oracle table to FIle
Hi,
I have a requirement where i need to load the data from a table to a file.
I need to convert columns into rows (pivot and unpivot). Does ODI support this? I have converted the columns to a single row using a procedure, but I want to achieve this using an interface. Say
1
2
3
4
is the column; I need it to be 1,2,3,4.
What is the best way to achieve this?
Please give me some inputs for this.
Thanks in Advance,
Khaleel
Hi,
You can refer Metalink Note 423694.1 for Pivoting using ODI.
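The collapse-to-a-single-row step the question describes can also be sketched outside ODI; a runnable example with an invented one-column table:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE src (val INTEGER)")
conn.executemany("INSERT INTO src VALUES (?)", [(1,), (2,), (3,), (4,)])

# Read the column in a defined order and join it into one delimited value;
# on the SQL side, LISTAGG / GROUP_CONCAT-style aggregates do the same job.
vals = [str(r[0]) for r in conn.execute("SELECT val FROM src ORDER BY val")]
row = ",".join(vals)
print(row)  # 1,2,3,4
```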
Thanks,
Sutirtha -
Hierarchy for Master Data Load from BW
Hi All
I'm loading master data from BW InfoObject 0CUSTOMER. There is no hierarchy maintained for it in BW.
Rather than creating the hierarchy manually in BPC, I want to include a mapping in the transformation/conversion file that rolls the master data records coming from BW up to ALL_CUSTOMERS (a member maintained manually in BPC). ALL_CUSTOMERS will be the parent node of the hierarchy in BPC; no other hierarchy levels are required, and all the data should have this one parent.
Something like PARENT=*STR(ALL_CUSTOMERS)
Please share your ideas; help is very much appreciated.
Thanks.
Hi Asif,
If you are loading master data from a BW InfoObject, you will not be able to load/specify any value for ParentH1/hierarchy node, as BW/BPC does not allow that.
An option would be to:
1) load the hierarchy from a flat file, or
2) maintain the hierarchy in BW and load it into BPC the normal way.
Regards
Surabhi -
Data Loading from one table to another in the Same Database based on conditions .
Hi ALL ,
I have 2 tables, Products and Product_info.
In the Product_info table, Product_id is the primary key but not an identity column, so the auto-increment of the number needs to be performed by the package.
Requirement is :
If the Product_ID is 20 and the Date lies in the previous month (not the current month) in the Products table, then
insert into the Product_info table according to the rules below:
1. If the Name contains 'tap', ignore it completely; don't perform any insert for it.
2. If the Name contains 'Zork', perform 2 inserts into the Product_info table, with Product_info_IDs 1 and 2.
3. If the Name contains neither 'Zork' nor 'tap', insert it into the Product_info table with Product_info_ID 4.
I am very new to SSIS package development, so detailed information would be helpful.
Source table (Products):

ID  NAME               Product_ID  Date        Area_ID
1   P_tap_rus          20          13-01-2014  3
2   Enc_sap_top        10          15-01-2014  4
3   Yorl               20          05-02-2014  5
4   zork               20          20-01-2014  6
5   fadbt              10          22-01-2014  6
6   xyzzz_oprt         20          28-01-2014  5
7   def_type_ru        20          06-02-2014  2
8   acd_inc_tup        10          07-02-2014  3
9   bnf_dlk_fbg        20          03-02-2014  4
10  rtyui_vnmghj_sfdl  10          12-01-2014  5
11  wlwf_10103_123     10          04-02-2014  9
Destination table (Product_info):

Product_ID  ID  Area_ID  Product_info_ID  Column1
1           3   5        4                As Name doesn't contain Zork or tap
2           4   3        1                As ID 4 is zork: 2 inserts, one for 1 and one for 2 in the Product_info_ID column
3           4   3        2
4           6   5        4
5           10  5        4
6           11  9        4
Please let me know if any other information is required .
Thanks
Priya
Hi Priya,
You mentioned this was coming from two tables, right? I would try to perform the transformations with T-SQL in my source (if that is a possibility for you). Below is an example of something you could do.
WITH CTE
AS
(
    SELECT ID, Product_ID, [Date], Area_ID,
        CASE
            WHEN Name LIKE '%Zork%' THEN 1
            ELSE 4
        END AS Product_Info_ID
    FROM [YourTable]
    WHERE Product_ID = 20
      AND MONTH([Date]) = MONTH(DATEADD(MM, -1, GETDATE()))
      AND YEAR([Date]) = YEAR(DATEADD(MM, -1, GETDATE()))  -- month alone would also match last year's data
      AND Name NOT LIKE '%tap%'
)
SELECT *
FROM CTE
UNION
SELECT ID, Product_ID, [Date], Area_ID, 2 AS Product_Info_ID
FROM CTE
WHERE Product_Info_ID = 1
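The branching rules can also be unit-tested outside SSIS; a pure-Python sketch of the same logic (field names follow the Products table above; the previous-month check is pinned to January 2014 to match the sample data, whereas a real job would derive it from today's date):

```python
from datetime import date

def product_info_rows(row, prev_month=(2014, 1)):
    """Return the Product_info_ID values to insert for one Products row."""
    y, m = row["Date"].year, row["Date"].month
    if row["Product_ID"] != 20 or (y, m) != prev_month:
        return []          # out of scope: wrong product or wrong month
    name = row["NAME"].lower()
    if "tap" in name:
        return []          # rule 1: skip entirely
    if "zork" in name:
        return [1, 2]      # rule 2: two inserts
    return [4]             # rule 3: everything else

print(product_info_rows({"NAME": "P_tap_rus", "Product_ID": 20, "Date": date(2014, 1, 13)}))   # []
print(product_info_rows({"NAME": "zork", "Product_ID": 20, "Date": date(2014, 1, 20)}))        # [1, 2]
print(product_info_rows({"NAME": "xyzzz_oprt", "Product_ID": 20, "Date": date(2014, 1, 28)}))  # [4]
```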
I hope this helps, Regards. -
Data loading from Oracle to SQL Server
Hi,
I am trying to push data from Oracle, which is running on HP-UX, to SQL Server, but I don't know an efficient way to connect to SQL Server from Oracle on HP-UX.
I have heard about Oracle Heterogeneous Connectivity but don't know exactly how to implement it in a Unix environment. If you have a step-by-step guide, it would be really helpful.
Thanks in advance.
Regards,
Sajal
Hello,
please start reading here about the Oracle Database Gateways products:
http://www.oracle.com/technetwork/database/gateways/index-100140.html
HowTo articles are available on My Oracle Support:
How to Setup DG4ODBC on 64bit Unix OS (Linux, Solaris, AIX, HP-UX) (Doc ID 561033.1)
How to Setup DG4MSQL (Oracle Database Gateway for MS SQL Server) 64bit Unix OS (Linux, Solaris, AIX,HP-UX) (Doc ID 562509.1)
Please read also this note:
Functional Differences Between DG4ODBC and Specific Database Gateways (Doc ID 252364.1)
The license for DG4ODBC is included in your RDBMS license, but you need to purchase a Third-Party ODBC driver for MS SQL Server.
The license for DG4MSQL is not included in your RDBMS license.
There is also a forum for the gateways:
Heterogeneous Connectivity
Regards
Wolfgang