Extract data from a BW 7.0 cube to a SQL DB using Data Services XI
Hi Gurus,
We are trying to extract data from a BW 7.0 cube to a SQL database using Data Services XI. The issue is that we cannot read texts without making joins between the SIDs in the fact table and the master data tables. Do you know if it is possible to read the texts in a more natural way?
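As far as I know there is no way around the indirection when you read the tables directly: texts live in their own master data text tables, keyed by the characteristic value behind the SID, so a join is exactly what Data Services has to generate. A toy sketch of that join, using sqlite3 only as a stand-in; all table and column names here are invented for illustration (the real BW tables are the generated /BI0/ and /BIC/ tables):

```python
import sqlite3

# In-memory stand-in for the BW star schema: a fact table holding SIDs,
# an SID table mapping SID -> characteristic value, and a text table
# mapping value -> description.  Names are illustrative only.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE fact_sales   (material_sid INTEGER, amount REAL);
CREATE TABLE sid_material (sid INTEGER, material TEXT);
CREATE TABLE txt_material (material TEXT, descr TEXT);
INSERT INTO fact_sales   VALUES (1, 100.0), (2, 250.0);
INSERT INTO sid_material VALUES (1, 'M-01'), (2, 'M-02');
INSERT INTO txt_material VALUES ('M-01', 'Pump'), ('M-02', 'Valve');
""")

# The join you end up writing to resolve the texts for the fact rows:
rows = con.execute("""
    SELECT t.descr, f.amount
    FROM fact_sales f
    JOIN sid_material s ON s.sid = f.material_sid
    JOIN txt_material t ON t.material = s.material
    ORDER BY f.amount
""").fetchall()
print(rows)   # [('Pump', 100.0), ('Valve', 250.0)]
```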
Best Regards
Thanks Wondewossen,
As you know, the DataStores (Data Services) provide access to:
1.-Tables
2.-Functions
3.-IDOCs
4.-Open Hub Tables
We are trying to extract data using the first one (Tables), not using Open Hub.
Best Regards
Similar Messages
-
Unable to load data from an ODS to a Cube
Hi all,
I am trying to load data from an ODS to a cube, and I'm getting the following message at the bottom of the monitor screen: <b>"No corresponding requests were found for request REQU_BZUH9ZPYNL5XLQNB7K8IQDKFS in the ODS/Cube" Message no. RSBM043.</b>
I am unable to load the data. The QM status is yellow. When the process starts, the left-hand side of the monitor shows "0 of 84408 records", which is OK. I have loaded data with this package before and it worked well.
Can you tell me what I have to do? I tried waiting, but it stayed the same: no progress and no data in the cube.
Thank you very much and kind regards,
MM.
Maybe this helps...
When I look at the status, it says that I have a short dump in BW. It was CALL_FUNCTION_REMOTE_ERROR and the short text is "The function module "SUBST_TRANSFER_UPGEVAL" cannot be used for 'remote'". This short dump occurs well before I start the upload.
Thanks again.
Hello MM,
Can you do the following..
1. Set the total status to red.
2. Delete the request from the cube.
3. Go to the ODS -> Remove the Data Mart status -> Try loading it again.
The error message you get is common when trying to run an InfoPackage whose source object has no new request (meaning all the requests available in the source have already been moved to the targets). So please try the steps given above.
Hope it will help! -
Error while loading data from write optimized ODS to cube
Hi All,
I am loading data from a write-optimized ODS to a cube.
I have done Generate Export DataSource and
scheduled the InfoPackage with one selection for a full load.
Then it gave me the following error in Transfer IDocs & TRFC:
Info IDOC 1: IDOC with errors added
Info IDOC 2: IDOC with errors added
Info IDOC 3: IDOC with errors added
Info IDOC 4: IDOC with errors added
Data package 1 arrived in BW; Processing: selected number does not agree with transferred number.
The Processing step below that is green and
shows an update of 4 new records to data package 1.
Please provide inputs for the resolution
Thanks & Regards,
Rashmi.
Please let me know what more details you need.
If I click F1 for error details, I get the following message:
Messages from source system
see also Processing Steps Request
These messages are sent by IDoc from the source system. Both the extractor itself and the service API can send messages. When errors occur, several messages are usually sent together.
From the source system there are several types of messages, which can be differentiated by the so-called Info-IDoc status. The IDoc with status 2 plays a particular role here: it describes the number of records that were extracted in the source system and sent to BI. The number of records received in BI is checked against this information.
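In other words, the check that produces errors like "selected number does not agree with transferred number" boils down to comparing two counts. A trivial sketch; the function name and status strings below are mine, not SAP's:

```python
def idoc_count_check(records_sent: int, records_received: int) -> str:
    """Mimic the consistency check BI performs against the Info-IDoc with
    status 2: the record count the extractor reports as sent must match
    the count actually received in BI."""
    if records_sent == records_received:
        return "green"
    return "red: selected number does not agree with transferred number"

print(idoc_count_check(84408, 84408))  # green
print(idoc_count_check(4, 0))
```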
Thanks & Regards,
Rashmi. -
We are in the process of migrating from BPC 7 SP12 on Microsoft SQL Server to BPC 10 NetWeaver on SQL Server with BW 7.4, and we need to integrate our home-grown data warehouse, which is on a Microsoft SQL Server. The data warehouse currently connects to BPC 7 using Integration Services/Analysis Services and runs MDX queries against Analysis Services to retrieve data from the BPC 7 cube (view only). Please provide documentation on how to create this same integration with our data warehouse using BPC 10 NetWeaver on SQL Server.
When you were setting up your ODBC data source for
the Text driver, did you click on the "Options"
button? There are a lot of options in there that I
didn't understand at first glance.
Yes, I clicked on the Options button, but the only thing there deals with file extensions. I left it set to the default.
I have since tried closing my connection after I insert a record; I then try executeQuery, but still no luck. It seems strange that I can write to the file but not read from it. If anything, I'd expect the opposite problem.
I have also tried using the class "JoltReport" from the sun tutorial instead of my own with the same result.
Message was edited by:
Hentay -
Hi all,
We can get data from .xls and .xlsx files into a dataset. Now I want to read the entire data from a .xltm file.
I also want to get all the sheet names from the .xltm file. I am using .NET Framework 4.5 and Visual Studio 2012.
Please help me; I am unable to extract it. It is urgent.
Hello Kalpna Bindal,
I just investigated this issue and tried the following approach, which does work:
private void Button_Click(object sender, RoutedEventArgs e)
{
    // initialize the connection
    OleDbConnection con = new OleDbConnection();
    con.ConnectionString =
        "Provider=Microsoft.ACE.OLEDB.12.0;Data Source=D:\\Fruits1.xltm;Extended Properties=\"Excel 12.0 Xml;HDR=YES;\"";
    // pass a query to insert values into Excel
    OleDbCommand cmd = new OleDbCommand();
    cmd.CommandText = "insert into [Sheet1$] values('')";
    cmd.Connection = con;
    // execute the query
    con.Open();
    cmd.ExecuteNonQuery();
    con.Close();
}
I then did some research on this issue and found the following case:
http://stackoverflow.com/questions/22225017/reading-excel-xltm-files-programmatically
To test whether it is possible I use the following code in WPF:
Microsoft.Office.Interop.Excel.Application xltmApp = new Microsoft.Office.Interop.Excel.Application();
xltmApp.Visible = false;
xltmApp.ScreenUpdating = false;
Workbook xltmBook = xltmApp.Workbooks.Open(@"D:\Fruits1.xltm");
Worksheet xlworksheet = xltmBook.Worksheets.get_Item(1);
for (int i = 1; i <= 1; i++)
{
    for (int j = 1; j <= 1; j++)
    {
        string str = (string)(xlworksheet.UsedRange.Cells[i, j] as Microsoft.Office.Interop.Excel.Range).Value;
        MessageBox.Show(str);
    }
}
The result is that I can read this .xltm file and read the first line without problems.
I would think you can also use this approach to work with your .xltm file and output the data to your application.
For more details about how to load the data into your WPF application, these common WPF and Excel threads will be helpful:
http://www.codearsenal.net/2012/06/c-sharp-read-excel-and-show-in-wpf.html#.URPyfPJJocU
http://www.dotnetcurry.com/showarticle.aspx?ID=988
http://mozouna.com/Blog/2013/03/28/wpf-reading-and-writing-to-excel-file-with-c/
By the way, for WPF questions please post on the WPF forum instead of the C# forum.
Best regards,
Barry
-
Data from ODS and a Inventory cube
Hi Guys,
We have an Inventory cube, ZIC_C03, and an ODS which has some detailed information, like batch-level information.
There is a MultiProvider on top of these two.
My question is: can we use a cube on top of the ODS and feed that into the MultiProvider?
Will that improve query performance?
What are the drawbacks or repercussions I need to understand first?
Thanks a lot in advance for your help.
Hi Murali,
Yes, if you load into a cube from your ODS you can improve performance. I typically load data that can be summarized into the cube and cut out document numbers, line items, etc. If you need the detail, then just jump to a query on the ODS object's MultiProvider.
The drawback is that you no longer get all of your detailed information at the cube level.
Thanks,
Brian -
Error in uploading data from FIAP ODS to FIAP CUBE
We are trying to upload data from the FIAP line-item ODS to the FIAP cube. However, it has been showing zero records updated for a long time. We tried deleting and repeating the load, but again it shows zero records only. The monitor status is yellow.
The following message appears on the screen:
Missing message: Number of sent records
Regards
Renu Gurumurthy
Hi Renu,
The search link below should help a lot:
https://www.sdn.sap.com/irj/sdn/advancedsearch?cat=sdn_all&query=missingmessage%3ANumberofsent+records&adv=false&sortby=cm_rnd_rankvalue -
How to upload the data from existing ods to new cube?
Hi all,
There is an existing ODS, and reports have been created on that ODS. My task is to create a cube and pull the data from the ODS into the cube.
I have created a cube with update rules, and I want to know how to load all the data from the existing ODS into the newly created cube.
Thanks,
haritha
Hi haritha,
If you've already created the update rules, try to create the cube, then make a mapping between the ODS & cube.
If done:
1. Right-click on the ODS and find the menu entry "Update data to data target".
2. Choose Init.
If you want it to update data to the cube automatically, then you must set this up in the ODS:
1. Double-click on the ODS.
2. In the settings, you'll see the option "automatically update data to data target".
3. Select it.
Hopefully this helps you a lot.
Regards,
Niel.
thanks for any points you choose to assign. -
IP Exit Planning function copy data from cca to pca planning cubes
Hello All,
I have a requirement where I have to copy the characteristics and key figures of the CCA plan cube data to the PCA plan cube data. The InfoObjects in the CCA aggregation level are {0AMOUNT, 0COSTCENTER, 0COSTELEMENT, 0VERSION, 0CALMONTH, 0INFOPROVIDER}, which need to be copied to the corresponding InfoObjects in the PCA level {0AMOUNT, 0PROFITCENTER, 0ACCOUNT, 0VERSION, 0CALMONTH, 0INFOPROVIDER}.
The CCA and PCA aggregation levels are built on top of a MultiProvider.
I can do it using FOX coding, but 0COSTELEMENT cannot be mapped to 0ACCOUNT, as these two are different fields. Since I have to copy the values of 0COSTELEMENT to 0ACCOUNT, I was wondering how I can do it using an exit function.
As I have never used an exit function before, I was wondering if somebody could help me out with this.
By the way, I have read the forums and figured out that I need to create a class in SE24 and use the interface
IF_RSPLFA_SRVTYPE_IMP_EXEC, and since I am generating some records, I will be using the method IF_RSPLFA_SRVTYPE_IMP_EXEC~INIT_EXECUTE.
I have also read in the forums that there are methods/function modules which can copy data from one aggregation level to another. Anyway, can you tell me how I can loop through the records of the CCA aggregation level and copy them to the PCA aggregation level?
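Not ABAP, but the core of what the exit method has to do can be sketched as a simple field-renaming loop over the records. The Python below is only pseudocode for the idea; the field names come from the InfoObject lists in the post, while the sample values are invented:

```python
# Characteristic mapping for the CCA -> PCA copy.  The renaming dict is
# the whole trick: 0COSTELEMENT's values land in 0ACCOUNT, and
# 0COSTCENTER's values land in 0PROFITCENTER.
FIELD_MAP = {
    "0COSTCENTER":  "0PROFITCENTER",
    "0COSTELEMENT": "0ACCOUNT",
    "0VERSION":     "0VERSION",
    "0CALMONTH":    "0CALMONTH",
    "0INFOPROVIDER": "0INFOPROVIDER",
    "0AMOUNT":      "0AMOUNT",
}

def copy_record(cca_rec: dict) -> dict:
    """Return a PCA record with the CCA values moved to the renamed fields."""
    return {FIELD_MAP[k]: v for k, v in cca_rec.items()}

pca = copy_record({"0COSTCENTER": "CC100", "0COSTELEMENT": "400100",
                   "0VERSION": "001", "0CALMONTH": "200901",
                   "0INFOPROVIDER": "ZCCA_AL", "0AMOUNT": 125.0})
print(pca["0ACCOUNT"])        # 400100
print(pca["0PROFITCENTER"])   # CC100
```

In the real exit class this loop would sit in the execute method, reading the inbound reference data and appending renamed rows to the outbound table.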
Edited by: nazeer on Feb 22, 2009 12:04 PM
This thread might help you.
https://forums.sdn.sap.com/click.jspa?searchID=22634973&messageID=5317176 -
How to insert 300 records from an associative array into a backend table in PL/SQL
HI ALL,
I'm posting my code here:
Creating back end table:
Create table orlando
( id number(20),
calltype number(12),
gateway_name varchar2(25),
accounting_id varchar2(18),
start_time_system_ticks number(11),
node_time_zone varchar2(25),
start_date varchar2(10),
start_time varchar2(10),
softswitch_response number(11),
alerting number(11)
);
Creating package:
CREATE OR REPLACE PACKAGE r IS
type apollo_rec is record(
id number(20),
calltype number(12),
gateway_name varchar2(25),
accounting_id varchar2(18),
start_time_system_ticks number(11),
node_time_zone varchar2(25),
start_date varchar2(10),
start_time varchar2(10),
softswitch_response number(11),
alerting number(11)
);
TYPE bin_array IS TABLE OF apollo_rec INDEX BY BINARY_INTEGER;
PROCEDURE rr (state_array bin_array);
END ;
SET SERVEROUT ON
CREATE OR REPLACE PACKAGE BODY r IS
PROCEDURE rr (state_array bin_array) IS
BEGIN
FOR i IN 1 .. state_array.COUNT LOOP
INSERT INTO orlando(id,calltype,gateway_name,accounting_id,start_time_system_ticks)VALUES(state_array(i).id,state_array(i).calltype,state_array(i).gateway_name,
state_array(i).accounting_id,state_array(i).start_time_system_ticks);
COMMIT;
END LOOP;
END ;
END ;
I've run this code in iSQL*Plus. When I run it for 5 entries there is no error, but when I modify the insert statement for 300 entries (300 identifiers in the insert statement),
it gives me this error:
Warning: Package Body created with compilation errors.
Errors for PACKAGE BODY R:
LINE/COL ERROR
7/2 PL/SQL: SQL Statement ignored
7/14 PL/SQL: ORA-00913: too many values
Is there any feature in PL/SQL to reduce the number of identifiers in the insert statement, keep the insert statement and the program small, and increase performance?
Edited by: 983040 on Jan 20, 2013 11:11 PM
Basic example (ran on 11.2.0.3):
SQL> create table testtab( id number, day date, val varchar2(30) );
Table created.
SQL>
SQL> create or replace package TestTabLib as
2
3 type TTestTab is table of testtab%rowtype;
4
5 procedure InsertRows( rowArray TTestTab );
6
7 end;
8 /
Package created.
SQL>
SQL> create or replace package body TestTabLib as
2
3 procedure InsertRows( rowArray TTestTab ) is
4 begin
5 forall i in 1..rowArray.Count
6 insert into testtab values rowArray(i);
7 end;
8
9 end;
10 /
Package body created.
SQL>
SQL> declare
2 rowArray TestTabLib.TTestTab;
3 begin
4 --// populating the array - using a bulk fetch as
5 --// an example
6 select
7 object_id, created, object_name
8 bulk collect into
9 rowArray
10 from all_objects
11 where rownum < 11;
12
13 --// bulk insert array
14 TestTabLib.InsertRows( rowArray );
15 end;
16 /
PL/SQL procedure successfully completed.
SQL>
SQL> select * from testtab;
ID DAY VAL
100 2011/12/05 09:16:03 ORA$BASE
116 2011/12/05 09:16:04 DUAL
117 2011/12/05 09:16:04 DUAL
280 2011/12/05 09:19:09 MAP_OBJECT
365 2011/12/05 09:19:10 SYSTEM_PRIVILEGE_MAP
367 2011/12/05 09:19:10 SYSTEM_PRIVILEGE_MAP
368 2011/12/05 09:19:10 TABLE_PRIVILEGE_MAP
370 2011/12/05 09:19:11 TABLE_PRIVILEGE_MAP
371 2011/12/05 09:19:11 STMT_AUDIT_OPTION_MAP
373 2011/12/05 09:19:11 STMT_AUDIT_OPTION_MAP
10 rows selected.
SQL>
SQL> declare
2 rowArray TestTabLib.TTestTab;
3 begin
4 --// populating the array - using a custom build
5 --// loop example such as a Java front-end will
6 --// use reading data from user input form
7 rowArray := new TestTabLib.TTestTab();
8 rowArray.Extend(2); --// user entered 2 values
9 for i in 1..rowArray.Count loop
10 rowArray(i).id := i;
11 rowArray(i).day := trunc(sysdate);
12 rowArray(i).val := 'value '||to_char(i,'000');
13 end loop;
14
15 --// bulk insert array
16 TestTabLib.InsertRows( rowArray );
17 end;
18 /
PL/SQL procedure successfully completed.
SQL>
SQL> select * from testtab where val like 'value%';
ID DAY VAL
1 2013/01/21 00:00:00 value 001
2 2013/01/21 00:00:00 value 002
SQL> -
I just bought a Samsung 830 256 GB solid state drive which I will use to replace the stock hard drive in my 15-inch, 2.53 GHz, Mid 2009 MacBook Pro. I also have an external hard drive, connected via FireWire, on which I keep my regular Time Machine backups. Would I be able to migrate all my data from it after putting in the solid state drive? I prefer not to use any third-party applications, and I do not plan on replacing the SuperDrive with the hard drive that I'm removing from my laptop.
chinodelarosa wrote:
Would I be able to migrate all my data from it after putting in the solid state drive?
Yes. -
How to select data from 3rd row of Excel to insert into Sql server table using ssis
Hi,
I have Excel files with headers in the first two rows. I want to skip those two rows and select data from the 3rd row to insert into a SQL Server table using SSIS; the 3rd row has the column names.
CUSTOMER DETAILS
REGION
COL1 COL2 COL3 COL4 COL5 COL6 COL7
COL8 COL9 COL10 COL11
1 XXX yyyy zzzz
2 XXX yyyy zzzzz
3 XXX yyyy zzzzz
4 XXX yyyy zzzzz
The first two rows have merged cells with headings in Excel; I want to skip the first two rows, select the data from the 3rd row, and insert it into SQL Server using SSIS.
Set range within Excel command as per below
See
http://www.joellipman.com/articles/microsoft/sql-server/ssis/646-ssis-skip-rows-in-excel-source-file.html
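The linked article shows the SSIS-specific setting (the range on the Excel Source's OpenRowset). The underlying idea is just "skip two rows, treat row 3 as the header"; here it is sketched with Python's csv module on a CSV stand-in, purely to illustrate the range logic. The sample data mirrors the layout in the question:

```python
import csv
import io

# Stand-in for the worksheet: two merged-heading rows, then the real
# header row, then the data rows.
raw = """CUSTOMER DETAILS,,,
REGION,,,
COL1,COL2,COL3,COL4
1,XXX,yyyy,zzzz
2,XXX,yyyy,zzzzz
"""

reader = csv.reader(io.StringIO(raw))
next(reader)            # skip row 1 (CUSTOMER DETAILS)
next(reader)            # skip row 2 (REGION)
header = next(reader)   # row 3 holds the column names
rows = [dict(zip(header, r)) for r in reader]
print(header)           # ['COL1', 'COL2', 'COL3', 'COL4']
print(rows[0]['COL2'])  # XXX
```

In SSIS the equivalent is addressing a range that starts at row 3 in the Excel Source, as described in the article above.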
Visakh
-
Post Author: vpost
CA Forum: Data Connectivity and SQL
I am trying to join information from two related web services within CR XI. I have successfully set up the web services as data sources and have been able to get to the point where I get good data back. However, when I try to pull in certain fields, I get an error that says "Failed to retrieve data from the database/invalid argument provided". Here's the scenario:
The web services are structured as follows: Web Service 1 (Artist) has attributes of Artist Name and Date of Birth. Web Service 2 (CD) has attributes of CD Title and Release Date. Underneath each CD are songs, each of which has a Song Title and Artist Name.
I have defined both web services and defined a link between Artist.Artist Name and CD/Song.Artist Name. I am able to run a report with Song Title and Date of Birth that crosses web services. I am able to run another report with Song Title and CD Title that crosses the different levels in the second web service. However, if I add CD Title to the first report or Date of Birth to the second (both of which effectively force CR to employ the link between the two web services AND the CD/Song hierarchical structure in the second web service), I get the aforementioned error.
Any assistance understanding how multiple web services can be linked in this manner would be greatly appreciated.
Thanks in advance.
Post Author: Mike Wright
CA Forum: Data Connectivity and SQL
Not sure about your exact situation, but I was having a similar problem with another application and tracked it down to security. I added the user to the Domain Admins group and it works fine. It appears to be accessing a subdirectory which it does not have permission to use; it then times out and returns the "invalid..." error. It seems that once the query goes over a certain size (and I'm not sure what triggers this) it needs to make use of a temporary file on disk instead of RAM.
I'm still trying to track down which temporary directory it's trying to use, so let me know if you have any questions.
cheers -
Retrieving data from SQLite and making a link with the ID of the retrieved data?
Hello, first of all sorry if my English is bad, as I'm from Mexico.
I am using HTML/JavaScript/SQLite, and I'm having an issue while retrieving data from SQLite. All my queries work right, but when I select the Id, Name, Firstname, and Lastname, I want to print only the Name, Firstname, and Lastname on screen, and use the Id as a link (an onclick query that fetches more info about that person by id).
My SQLite query is the following:
var sql = "SELECT [sis_persona].[idPersona], [sis_persona].[Nombre], [sis_persona].[Paterno], [sis_persona].[Materno] FROM [sis_persona] WHERE Nombre LIKE '"+ nombre1+"%' AND Paterno LIKE '"+ paterno1+"%' AND Materno LIKE '"+ materno1+"%' ORDER BY [sis_persona].[Nombre], [sis_persona].[Paterno], [sis_persona].[Materno] LIMIT 20";
And my result handler is the following:
row = document.createElement("tr");
cell = document.createElement("th");
cell.innerText = "Nombre";
row.appendChild(cell)
cell = document.createElement("th");
cell.innerText = "Paterno";
row.appendChild(cell)
cell = document.createElement("th");
cell.innerText = "Materno";
row.appendChild(cell)
tbl.appendChild(row);
var numRows = result.data.length;
for (var i = 0; i < numRows; i++) {
    row = document.createElement("tr");
    // iterate over the columns in the result row object
    for (col in result.data[i]) {
        var ea = result.data[i];
        cell = document.createElement("td");
        var a = document.createElement('a');
        a.setAttribute('HREF', '#');
        var txt = document.createTextNode(ea.Nombre);
        a.appendChild(txt);
        a.onclick = function() { querytest(ea.idPersona); };
        row.appendChild(a);
        row.appendChild(cell);
        cell = document.createElement("td");
        var a = document.createElement('a');
        a.setAttribute('HREF', '#');
        var txt = document.createTextNode(ea.Paterno);
        a.appendChild(txt);
        a.onclick = function() { querytest(ea.idPersona); };
        row.appendChild(a);
        row.appendChild(cell);
        cell = document.createElement("td");
        var a = document.createElement('a');
        a.setAttribute('HREF', '#');
        var txt = document.createTextNode(ea.Materno);
        a.appendChild(txt);
        a.onclick = function() { querytest(ea.idPersona); };
        row.appendChild(a);
        row.appendChild(cell);
    }
    tbl.appendChild(row);
}
And this is where I have the problem: it prints everything three times, and I don't know how to print each result only once and make a link with the id from the query.
var ea = result.data[i];
cell = document.createElement("td");
var a = document.createElement( 'a' );
a.setAttribute('HREF','#');
var txt = document.createTextNode( ea.Nombre );
a.appendChild( txt );
a.onclick = function() {querytest(ea.idPersona);};
row.appendChild( a );
row.appendChild(cell);
Anyone have any ideas?
Sorry again if my english is bad.
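Two separate issues are visible in the handler above: the inner for (col in ...) loop runs the three cell-building blocks once per column, which is what prints everything three times; and all the onclick closures capture the same ea variable (var is function-scoped), so every link ends up calling querytest with the last row's id. That second trap is late binding of closures, and Python has exactly the same behavior, so here is the pattern and the usual fix sketched there (in JavaScript the equivalent fixes are an IIFE that takes the id as a parameter, or declaring the variable with let):

```python
# Late-binding closures: every handler sees the loop variable's *final*
# value -- the same trap as the JavaScript onclick handlers above.
handlers_buggy = []
for person_id in [101, 102, 103]:
    handlers_buggy.append(lambda: person_id)   # captures the variable

# All three return 103, the last value person_id held:
print([h() for h in handlers_buggy])           # [103, 103, 103]

# Fix: bind the current value at the moment the function is created.
handlers_fixed = []
for person_id in [101, 102, 103]:
    handlers_fixed.append(lambda pid=person_id: pid)

print([h() for h in handlers_fixed])           # [101, 102, 103]
```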
Thanks in advance.
Hi Prashant,
In continuation of my thread above, there is a small correction: it is not possible directly in LSMW. I made a report using the same LSMW code, and through that report I uploaded the data, selecting the file directly.
Regards,
Madhu. -
DLL data from a memory address not updating at correct rate when using DAQ
Hi,
I am using a USB DAQPad-6015 to acquire data from 8 analog channels, using an external clock (200 Hz). I am also retrieving data from a Call Library Function Node, which schedules a callback function to run at approximately 1000 Hz. The function prototype accepts a pointer to an array, and the array contents are updated each time the callback function runs. Thus in LabVIEW I initialize an array, pass it as a parameter, and am able to retrieve the data.
Using an event structure, I sample the data from the library function node every time the DAQ takes a sample (so at 200 Hz). I expect to see no repeated values in the DLL data, but this isn't the case.
I am not sure why this happens. I thought acquiring data from the analog channels might take priority over the scheduled callback function, so it might not actually be running at 1000 Hz as I expect, but I'm wondering if there could be any other explanation.
Any help would be much appreciated.
I've attached a simplified version of the VI and the output data.
Attachments:
multifunction_data.xls 139 KB
multchan1simp.vi 32 KB
But the data in the array is updated after the library function is first called. I think LabVIEW is somehow accessing this from a memory address. The attached Excel file shows the results, and the value is updated only every couple of samples. I'm just confused about why it isn't updated every sample.
I do have the source for the C DLL, though. I'll see if I can modify it as you suggested.
The actual callback function is the following:
extern "C" __declspec(dllexport) HDCallbackCode HDCALLBACK updateDeviceCallback(void *pUserData)
{
    hdBeginFrame(hdGetCurrentDevice());
    /* Get the current location of the device (HD_GET_CURRENT_POSITION).
       We declare a vector of three doubles since hdGetDoublev returns
       the information in a vector of size 3. */
    hdGetDoublev(HD_CURRENT_POSITION, gServoDeviceData.m_devicePosition);
    hdGetDoublev(HD_CURRENT_JOINT_ANGLES, gServoDeviceData.m_deviceAngles);
    DevDataArray[0] = gServoDeviceData.m_devicePosition[0];
    DevDataArray[1] = gServoDeviceData.m_devicePosition[1];
    DevDataArray[2] = gServoDeviceData.m_devicePosition[2];
    DevDataArray[3] = gServoDeviceData.m_deviceAngles[0];
    DevDataArray[4] = gServoDeviceData.m_deviceAngles[1];
    DevDataArray[5] = gServoDeviceData.m_deviceAngles[2];
    memcpy(pUserData, &DevDataArray, 6*sizeof(HDdouble));
    hdEndFrame(hdGetCurrentDevice());
    return HD_CALLBACK_CONTINUE;
}
Thanks for your reply!
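One way to sanity-check the "callback isn't really running at 1 kHz" theory: repeated reads are exactly what you get whenever the effective update period is longer than the read period. A toy Python timeline; all the numbers below are invented, only the relationship between the two periods matters:

```python
# Reader samples every 5 ms (200 Hz).  Suppose the callback, although
# scheduled at 1 kHz, effectively completes an update only every 7 ms
# under load.  The value visible at time t is then the index of the
# latest completed update, and some consecutive reads see the same one.
update_period_ms = 7
read_period_ms = 5

reads = [t // update_period_ms for t in range(0, 100, read_period_ms)]

repeats = sum(1 for a, b in zip(reads, reads[1:]) if a == b)
print(reads[:6])                               # [0, 0, 1, 2, 2, 3]
print("repeated consecutive reads:", repeats)  # 6
```

If the callback really ran faster than 200 Hz, every read would see a fresh update index; counting the duplicates in the attached data against this model gives a rough estimate of the callback's true rate.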