Writing a query using a multisource-enabled data foundation
I know there is an easy way to do this but I'm suffering from a mind block late on a Friday afternoon.
What I have is a multisource-enabled data foundation that is reading data from a connection to multi-cube MC07 in our DEV system and multi-cube MC07 in our QAS system. The two multi-cubes are joined in the data foundation on create date and contract number.
Here is what I have done so far:
- Created two relational multisource-enabled connections: one to multi-cube MC07 in our DEV system and one to multi-cube MC07 in our QA system.
- Created a multi-source data foundation on the connections with joins on create date and contract number.
- Created a relational data source business layer using my data foundation
- In the business layer I click on the Queries tab and then the "+" to bring up the query panel
I want to write a query that combines the data from DEV and QA and lists all the contract numbers, their create dates, and a gross quantity field from both systems.
How do I write the query?
Appreciate any help
Mike...
Whenever we create a data foundation with multisource enabled, the data sources are not enabled. For single-source it works fine. We are using the SAP BO 4.0 SP4 client tools. How do we resolve this issue?
Similar Messages
-
SQL Query using a Variable in Data Flow Task
I have a Data Flow task that I created. The source query is in the file "LPSreason.sql", which is stored on a shared drive such as
\\servername\scripts\LPSreason.sql
How can I use this .sql file as a SOURCE in my Data Flow task? I guess I can use SQL Command as Access Mode, but I'm not sure how to do that.
Hi Desigal59,
You can use a Flat File Source adapter to get the query statement from the .sql file. When creating the Flat File Connection Manager, set the Row delimiter to a character that won’t be in the SQL statement such as “Vertical Bar {|}”. In this way, the Flat
File Source outputs only one row with one column. If necessary, you can set the data type of the column from DT_STR to DT_TEXT so that the Flat File Source can handle SQL statements longer than 8000 characters.
After that, connect the Flat File Source to a Recordset Destination, so that the column is stored into an SSIS object variable (supposing the variable name is varQuery).
In the Control Flow, we can use one of the following two methods to pass the value of the Object type variable varQuery to a String type variable QueryStr which can be used in an OLE DB Source directly.
Method 1: via Script Task
1. Add a Script Task under the Data Flow Task and connect them.
2. Add User::varQuery as ReadOnlyVariables and User::QueryStr as ReadWriteVariables.
3. Edit the script as follows:
public void Main()
{
    // Fill a DataTable from the SQL text held in varQuery (an ADO recordset),
    // then copy the single cell into the string variable QueryStr.
    System.Data.OleDb.OleDbDataAdapter da = new System.Data.OleDb.OleDbDataAdapter();
    DataTable dt = new DataTable();
    da.Fill(dt, Dts.Variables["User::varQuery"].Value);
    Dts.Variables["User::QueryStr"].Value = dt.Rows[0].ItemArray[0];
    Dts.TaskResult = (int)ScriptResults.Success;
}
4. Add another Data Flow Task under the Script Task, and join them. In the Data Flow Task, add an OLE DB Source, set its Data access mode to “SQL command from variable”, and select the variable User::QueryStr.
Method 2: via Foreach Loop Container
1. Add a Foreach Loop Container under the Data Flow Task, and join them.
2. Set the enumerator of the Foreach Loop Container to Foreach ADO Enumerator, and select the ADO object source variable User::varQuery.
3. In the Variable Mappings tab, map the collection value to User::QueryStr, with Index 0.
4. Inside the Foreach Loop Container, add a Data Flow Task like step 4 in Method 1.
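Outside SSIS, the underlying pattern of both methods (load the statement text from a .sql file into a variable, then execute the variable's contents) can be sketched in a few lines of Python. The file name matches the post, but the tiny `lps` table and its query are invented for illustration:

```python
import os
import sqlite3
import tempfile

# Write a stand-in for the shared-drive LPSreason.sql (contents invented).
sql_path = os.path.join(tempfile.mkdtemp(), "LPSreason.sql")
with open(sql_path, "w") as f:
    f.write("SELECT reason, COUNT(*) AS n FROM lps GROUP BY reason")

# Load the statement into a variable, like varQuery in the SSIS package...
with open(sql_path) as f:
    query_str = f.read()

# ...then execute the variable's contents against the database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE lps (reason TEXT)")
conn.executemany("INSERT INTO lps VALUES (?)", [("late",), ("late",), ("lost",)])

rows = sorted(conn.execute(query_str).fetchall())
print(rows)  # [('late', 2), ('lost', 1)]
```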
Regards,
Mike Yin
TechNet Community Support -
I'm using the LabView Database Connectivity Toolset and am using the following query...
UPDATE IndexStation
SET Signal_Size=200
WHERE 'StartTime=12:05:23'
Now the problem is that this command seems to update all rows in the table IndexStation... Not just specifically the row where StartTime=12:05:23
I have tried all sorts of {} [] / ' " around certain characters and column names, but it always seems to update all rows...
I've begun to use the SQL query tab in Access to try and narrow down as to why this happens, but no luck!
Any ideas!?
Thanks,
Chris.
Chris Walter wrote:
I completely agree about the Microsoft issue.
But it seems no SQL based manual states that { } will provide a Date/Time constant.
Is this an NI only implementation? Because I can't seem to get it to function correctly within LabView or in any SQL query.
Chris.
There is nothing about the database toolkit in terms of SQL syntax that would be NI specific. The database Toolkit simply interfaces to MS ADO/DAO and the actual SQL syntax is usually implemented in the database driver or database itself although I wouldn't be surprised if ADO/DAO does at times munch a bit with that too.
The Database Toolkit definitely does not. So this might be a documentation error indeed. My understanding of SQL syntax is in fact rather limited so not sure which databases might use what delimiters to format date/time values. I know that SQL Server is rather tricky thanks to MS catering for the local date/time format in all their tools and the so called universal date/time format has borked on me on several occasions.
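For what it's worth, the quoting in the original query may itself be the culprit: in `WHERE 'StartTime=12:05:23'` the quotes wrap the whole expression, so the database sees a lone string literal rather than a comparison on StartTime, and some engines then evaluate that for every row. A minimal sketch of the corrected form, using Python's sqlite3 with an invented three-row table:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE IndexStation (StartTime TEXT, Signal_Size INTEGER)")
conn.executemany("INSERT INTO IndexStation VALUES (?, ?)",
                 [("12:05:23", 100), ("12:06:00", 100), ("12:07:41", 100)])

# Wrong: the quotes wrap the whole expression, so the WHERE clause is just a
# string literal and never compares against the StartTime column:
#   UPDATE IndexStation SET Signal_Size=200 WHERE 'StartTime=12:05:23'

# Right: quote only the value, or better, bind it as a parameter.
conn.execute("UPDATE IndexStation SET Signal_Size=200 WHERE StartTime = ?",
             ("12:05:23",))

rows = sorted(conn.execute("SELECT StartTime, Signal_Size FROM IndexStation"))
print(rows)
# [('12:05:23', 200), ('12:06:00', 100), ('12:07:41', 100)]
```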
Rolf Kalbermatter
CIT Engineering Netherlands
a division of Test & Measurement Solutions -
HR Query using two different master data attribute values
I have a requirement to calculate headcount based on a certain condition.
Example: The condition is as follows:
For each "Performance Rating" (ROW)
Count headcount for "Below Min" (COLUMN)
- Below Min = "Annual Salary" less than "Pay Grade Level Min"
- "Annual Salary" and "Pay Grade Level" are attributes of "Employee" master data and are of type CURR
- "Pay Grade Level Min" in turn is an attribute of "Pay Grade Level" master data.
I am trying to build this query based on the standard InfoSet 0ECM_IS02, which has an ODS (Compensation Process), Employee, and Person.
Any help on the required approach is appreciated, e.g. creating a restricted KF using ROWCOUNT (Number of Records) and implementing the logic for "Below Min".
Thanks
Raj
Hi
Have you tried to create a variable for this KF with an exit? I think it is possible here.
Assign points if useful
Regards
N Ganesh -
Another way of writing a query using IN...
Hi ,
Is there another way to write the following query?
select DISTINCT TEST_DESCR_F
FROM TEST_CATALOG
where test_code in (select exetasi from parag_aktin
where barcode='090320061')
Thanks , a lot
Simon
It is a bit unfair to compare the run-time statistics of queries that only you can run. If you want people to be able to make suggestions that are more efficient, you should:
A) State that that is your requirement.
B) Post table create scripts and sample data.
Having said that you can try this
select distinct test_descr_f
from test_catalog a, parag_aktin b
where a.test_code = b.exetasi
and b.barcode = '090320061'
It is certainly different; whether it is more efficient only you can know. -
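The IN-subquery and join forms from the message above can be checked for equivalence mechanically (the DISTINCT absorbs any duplicate matches). A sketch using Python's sqlite3, with invented mini-tables standing in for TEST_CATALOG and PARAG_AKTIN:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE test_catalog (test_code TEXT, test_descr_f TEXT);
CREATE TABLE parag_aktin (barcode TEXT, exetasi TEXT);
INSERT INTO test_catalog VALUES ('T1','Glucose'), ('T2','Urea'), ('T3','Iron');
INSERT INTO parag_aktin VALUES ('090320061','T1'), ('090320061','T3'),
                               ('090320062','T2');
""")

# Original form: IN with a subquery.
in_form = conn.execute("""
    SELECT DISTINCT test_descr_f FROM test_catalog
    WHERE test_code IN (SELECT exetasi FROM parag_aktin
                        WHERE barcode='090320061')
""").fetchall()

# Suggested form: a plain join.
join_form = conn.execute("""
    SELECT DISTINCT test_descr_f
    FROM test_catalog a, parag_aktin b
    WHERE a.test_code = b.exetasi AND b.barcode = '090320061'
""").fetchall()

print(sorted(in_form) == sorted(join_form), sorted(in_form))
# True [('Glucose',), ('Iron',)]
```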
Bug in 10.2.0.5 for query using index on trunc(date)
Hi,
We recently upgraded from 10.2.0.3 to 10.2.0.5 (enterprise edition, 64bit linux). This resulted in some strange behaviour, which I think could be the result of a bug.
In 10.2.0.5, after running the script below the final select statement will give different results for TRUNC(b) and TRUNC (b + 0). Running the same script in 10.2.0.3 the select statement returns correct results.
BTW: it is related to index usage, because skipping the "CREATE INDEX" statement leads to correct results for the select.
Can somebody please confirm this bug?
Regards,
Henk Enting
-- test script:
DROP TABLE test_table;
CREATE TABLE test_table(a integer, b date);
CREATE INDEX test_trunc_ind ON test_table(trunc(b));
BEGIN
FOR i IN 1..100 LOOP
INSERT INTO test_table(a,b) VALUES(i, sysdate -i);
END LOOP;
END;
/
SELECT *
FROM (
SELECT DISTINCT
trunc(b)
, trunc(b + 0)
FROM test_table
);
Results on 10.2.0.3:
TRUNC(B) TRUNC(B+0)
05-08-2010 05-08-2010
04-08-2010 04-08-2010
01-08-2010 01-08-2010
30-07-2010 30-07-2010
28-07-2010 28-07-2010
27-07-2010 27-07-2010
23-07-2010 23-07-2010
22-07-2010 22-07-2010
17-07-2010 17-07-2010
03-07-2010 03-07-2010
26-06-2010 26-06-2010
etc.
Results on 10.2.0.5:
TRUNC(B) TRUNC(B+0)
04-05-2010 03-08-2010
04-05-2010 31-07-2010
04-05-2010 24-07-2010
04-05-2010 06-07-2010
04-05-2010 05-07-2010
04-05-2010 01-07-2010
04-05-2010 16-06-2010
04-05-2010 14-06-2010
04-05-2010 08-06-2010
04-05-2010 07-06-2010
04-05-2010 30-05-2010
etc.
Thanks for your reply.
I already looked at the metalink doc. It lists 4 bugs introduced in 10.2.0.5, but none of them seems related to my problem. Did I overlook something?
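The invariant the 10.2.0.5 result violates is that TRUNC(b) and TRUNC(b + 0) must agree for every row, since adding zero days cannot change the date. A quick sanity check of that invariant in plain Python (dates invented to mirror the test script, which is not a statement about Oracle internals):

```python
from datetime import datetime, timedelta

# 100 dates counting back from a fixed stand-in for SYSDATE, as in the script.
base = datetime(2010, 8, 5, 14, 30, 0)
rows = [base - timedelta(days=i) for i in range(1, 101)]

# TRUNC(b) strips the time part; TRUNC(b + 0) must give the same day,
# because adding zero days cannot move the date.
mismatches = [b for b in rows
              if b.date() != (b + timedelta(days=0)).date()]
print(len(mismatches))  # 0
```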
Regards,
Henk -
HELP! Query Using View to denormalize data
Help!!!
Below are two logical records of data that have been denormalized so that each column is represented as a different record in the database
RECORD NUMBER, COL POSITION, VALUE
1, 1, John
1, 2, Doe
1, 3, 123 Nowhere Lake
1, 4, Tallahassee
1, 5, FL
2, 1, Mary
2, 2, Jane
2, 3, 21 Shelby Lane
2, 4, Indianapolis
2, 5, IN
I need to write a view to return the data values in this format:
John, Doe, 123 Nowhere Lake, Tallahassee, FL
Mary, Jane, 21 Shelby Lane, Indianapolis, IN
I REALLY need this as soon as possible! Someone PLEASE come to my rescue!!!
Assuming that the other table has one record per record_num in the first table, then you could do something like:
SQL> SELECT * FROM t1;
RECORD_NUM DATE_COL
1 17-MAR-05
2 16-MAR-05
SQL> SELECT a.record_num, col1, col2, col3, col4, col5, t1.date_col
2 FROM (SELECT record_num,
3 MAX(DECODE(col_position, 1, value)) Col1,
4 MAX(DECODE(col_position, 2, value)) Col2,
5 MAX(DECODE(col_position, 3, value)) Col3,
6 MAX(DECODE(col_position, 4, value)) Col4,
7 MAX(DECODE(col_position, 5, value)) Col5
8 FROM t
9 GROUP BY record_num) a, t1
10 WHERE a.record_num = t1.record_num;
RECORD_NUM COL1 COL2 COL3 COL4 COL5 DATE_COL
1 John Doe 123 Nowhere Lake Tallahassee FL 17-MAR-05
2 Mary Jane 21 Shelby Lane Indianapolis IN 16-MAR-05
If your second table is structured like the first table then something more like:
SELECT record_num,
MAX(DECODE(source,'Tab1',DECODE(col_position, 1, value))) Col1,
MAX(DECODE(source,'Tab1',DECODE(col_position, 2, value))) Col2,
MAX(DECODE(source,'Tab1',DECODE(col_position, 3, value))) Col3,
MAX(DECODE(source,'Tab1',DECODE(col_position, 4, value))) Col4,
MAX(DECODE(source,'Tab1',DECODE(col_position, 5, value))) Col5,
MAX(DECODE(source,'Tab2',DECODE(col_position, 1, value))) T2_Col1,
MAX(DECODE(source,'Tab2',DECODE(col_position, 2, value))) T2_Col2
FROM (SELECT 'Tab1' source, record_num, col_position, value
FROM t
UNION ALL
SELECT 'Tab2' source, record_num, col_position, value
FROM t1)
GROUP BY record_num
By the way, I can't say I am enamoured of your data model.
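The MAX(DECODE(...)) pivot used above translates to portable SQL as MAX(CASE WHEN ...). A sketch with Python's sqlite3 on the sample rows from the question:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (record_num INT, col_position INT, value TEXT)")
conn.executemany("INSERT INTO t VALUES (?,?,?)", [
    (1, 1, "John"), (1, 2, "Doe"), (1, 3, "123 Nowhere Lake"),
    (1, 4, "Tallahassee"), (1, 5, "FL"),
    (2, 1, "Mary"), (2, 2, "Jane"), (2, 3, "21 Shelby Lane"),
    (2, 4, "Indianapolis"), (2, 5, "IN"),
])

# MAX(CASE WHEN ...) is the portable spelling of Oracle's MAX(DECODE(...)):
# each CASE picks out one col_position, and MAX collapses the group to it.
rows = conn.execute("""
    SELECT record_num,
           MAX(CASE WHEN col_position=1 THEN value END) AS col1,
           MAX(CASE WHEN col_position=2 THEN value END) AS col2,
           MAX(CASE WHEN col_position=3 THEN value END) AS col3,
           MAX(CASE WHEN col_position=4 THEN value END) AS col4,
           MAX(CASE WHEN col_position=5 THEN value END) AS col5
    FROM t GROUP BY record_num ORDER BY record_num
""").fetchall()
for r in rows:
    print(r)
# (1, 'John', 'Doe', '123 Nowhere Lake', 'Tallahassee', 'FL')
# (2, 'Mary', 'Jane', '21 Shelby Lane', 'Indianapolis', 'IN')
```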
John -
In DBI, how to find out the source query used for the report
Hi All,
How do I find out the source query used to display the data in DBI reports or dashboards? We can get it in the Apps front end by going to Help and Record History, but DBI runs in Internet Explorer, so I don't know how to get the source query (SELECT query) used.
In IE we have View --> Source, but that does not help since it gives the HTML code and not the SELECT query used.
If anyone has ever worked on it, please help me find it.
Thanks,
Neeraj Shrivastava
Hi Neeraj,
You can see the query used to display reports. Follow these steps to get the query:
1) Log in to Oracle Apps.
2) Select the "Daily Business Intelligence Administrator" responsibility.
3) Now click on "Enable/Disable Debugging" (now debugging is enabled).
4) Now open the report whose query you want to see.
5) In View Source it displays the query along with the bind variables.
Feel free to ping me if you have any doubts
thanks
kittu -
List of Values in Data foundation Layer Bi 4.0 Issues?
I have an issue surrounding lists of values (LOVs) created in the data foundation layer using SQL.
The LOV can be associated with an object within the business layer. The business layer is then saved and closed; however, on re-opening, the association with the LOV now displays an error.
We are currently on patch 15 of SP2; are you aware of any issues using LOVs from the data foundation layer?
This has now been fixed in SP4.
-
How to use Add Query Criteria for the MySQL database in NetBeans?
How do I use Add Query Criteria for the MySQL database in the NetBeans Visual Web Pack?
When the query criteria is added like
SELECT ALL counselors.counselors_id, counselors.first_name, counselors.telephone,counselors.email
FROM counselors WHERE counselors.counselors_id = ?
When I run this query in the query window
I get an error message box saying
Query Processing Error Parameter metadata not available for the given statement
If I run the query without the query criteria it works fine.
I am glad I am not the only one who has this problem. Part of the issue has been described above; there is something more in my case.
Whenever I try to call ****_tabRowSet.setObject(1, userDropList.getSeleted()); I get the error message shown below:
The Java code is:
public void dropDown1_processValueChange(ValueChangeEvent event) {
    Object s = this.dropDown1.getSelected();
    try {
        this.User_tabDataProvider1.setCursorRow(this.User_tabDataProvider1.findFirst("User_Tab.User_ID", s));
        this.getSessionBean1().getTrip_tabRowSet1().setObject(1, s);
        this.Trip_tabDataProvider1.refresh();
    } catch (Exception e) {
        this.log("Error: ", e);
        this.error("Error: Cannot select user" + e.getMessage());
    }
}
SQL statement for Trip_tabRowSet:
SELECT ALL Trip_Tab.Trip_Date,
Trip_Tab.User_ID,
Trip_Tab.Destination
FROM Trip_Tab
WHERE Trip_Tab.User_ID = ?
Error messages are shown below:
phase(RENDER_RESPONSE 6,com.sun.faces.context.FacesContextImpl@5abf3f) threw exception: com.sun.rave.web.ui.appbase.ApplicationException: java.sql.SQLException: No value specified for parameter 1 java.sql.SQLException: No value specified for parameter 1
com.sun.rave.web.ui.appbase.faces.ViewHandlerImpl.cleanup(ViewHandlerImpl.java:559)
com.sun.rave.web.ui.appbase.faces.ViewHandlerImpl.afterPhase(ViewHandlerImpl.java:435)
com.sun.faces.lifecycle.LifecycleImpl.phase(LifecycleImpl.java:274)
com.sun.faces.lifecycle.LifecycleImpl.render(LifecycleImpl.java:140)
javax.faces.webapp.FacesServlet.service(FacesServlet.java:245)
org.apache.catalina.core.ApplicationFilterChain.servletService(ApplicationFilterChain.java:397)
org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:303)
org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:184)
com.sun.webui.jsf.util.UploadFilter.doFilter(UploadFilter.java:240)
org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:216)
org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:184)
org.netbeans.modules.web.monitor.server.MonitorFilter.doFilter(MonitorFilter.java:368)
org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:216)
org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:184)
org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:276)
org.apache.catalina.core.StandardPipeline.doInvoke(StandardPipeline.java:566)
org.apache.catalina.core.StandardPipeline.invoke(StandardPipeline.java:536)
org.apache.catalina.core.StandardContextValve.invokeInternal(StandardContextValve.java:240)
org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:179)
org.apache.catalina.core.StandardPipeline.doInvoke(StandardPipeline.java:566)
com.sun.enterprise.web.WebPipeline.invoke(WebPipeline.java:73)
org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:182)
org.apache.catalina.core.StandardPipeline.doInvoke(StandardPipeline.java:566)
com.sun.enterprise.web.VirtualServerPipeline.invoke(VirtualServerPipeline.java:120)
org.apache.catalina.core.ContainerBase.invoke(ContainerBase.java:939)
org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:137)
org.apache.catalina.core.StandardPipeline.doInvoke(StandardPipeline.java:566)
org.apache.catalina.core.StandardPipeline.invoke(StandardPipeline.java:536)
org.apache.catalina.core.ContainerBase.invoke(ContainerBase.java:939)
org.apache.coyote.tomcat5.CoyoteAdapter.service(CoyoteAdapter.java:239)
com.sun.enterprise.web.connector.grizzly.ProcessorTask.invokeAdapter(ProcessorTask.java:667)
com.sun.enterprise.web.connector.grizzly.ProcessorTask.processNonBlocked(ProcessorTask.java:574)
com.sun.enterprise.web.connector.grizzly.ProcessorTask.process(ProcessorTask.java:844)
com.sun.enterprise.web.connector.grizzly.ReadTask.executeProcessorTask(ReadTask.java:287)
com.sun.enterprise.web.connector.grizzly.ReadTask.doTask(ReadTask.java:212)
com.sun.enterprise.web.connector.grizzly.TaskBase.run(TaskBase.java:252)
com.sun.enterprise.web.connector.grizzly.WorkerThread.run(WorkerThread.java:75)
StandardWrapperValve[Faces Servlet]: Servlet.service() for servlet Faces Servlet threw exception
java.sql.SQLException: No value specified for parameter 1
at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:910)
at com.mysql.jdbc.PreparedStatement.fillSendPacket(PreparedStatement.java:1674)
at com.mysql.jdbc.PreparedStatement.fillSendPacket(PreparedStatement.java:1622)
at com.mysql.jdbc.PreparedStatement.executeQuery(PreparedStatement.java:1332)
at com.sun.sql.rowset.internal.CachedRowSetXReader.readData(CachedRowSetXReader.java:193)
at com.sun.sql.rowset.CachedRowSetXImpl.execute(CachedRowSetXImpl.java:979)
at com.sun.sql.rowset.CachedRowSetXImpl.execute(CachedRowSetXImpl.java:1439)
at com.sun.data.provider.impl.CachedRowSetDataProvider.checkExecute(CachedRowSetDataProvider.java:1274)
at com.sun.data.provider.impl.CachedRowSetDataProvider.setCursorRow(CachedRowSetDataProvider.java:335)
at com.sun.data.provider.impl.CachedRowSetDataProvider.setCursorIndex(CachedRowSetDataProvider.java:306)
at com.sun.data.provider.impl.CachedRowSetDataProvider.getRowCount(CachedRowSetDataProvider.java:639)
at com.sun.webui.jsf.component.TableRowGroup.getRowKeys(TableRowGroup.java:1236)
at com.sun.webui.jsf.component.TableRowGroup.getFilteredRowKeys(TableRowGroup.java:820)
at com.sun.webui.jsf.component.TableRowGroup.getRowCount(TableRowGroup.java:1179)
at com.sun.webui.jsf.component.Table.getRowCount(Table.java:831)
at com.sun.webui.jsf.renderkit.html.TableRenderer.renderTitle(TableRenderer.java:420)
at com.sun.webui.jsf.renderkit.html.TableRenderer.encodeBegin(TableRenderer.java:143)
at javax.faces.component.UIComponentBase.encodeBegin(UIComponentBase.java:810)
at com.sun.webui.jsf.component.Table.encodeBegin(Table.java:1280)
at javax.faces.component.UIComponent.encodeAll(UIComponent.java:881)
at javax.faces.component.UIComponent.encodeAll(UIComponent.java:889)
at javax.faces.component.UIComponent.encodeAll(UIComponent.java:889)
at javax.faces.component.UIComponent.encodeAll(UIComponent.java:889)
at javax.faces.component.UIComponent.encodeAll(UIComponent.java:889)
at javax.faces.component.UIComponent.encodeAll(UIComponent.java:889)
at com.sun.faces.application.ViewHandlerImpl.doRenderView(ViewHandlerImpl.java:271)
at com.sun.faces.application.ViewHandlerImpl.renderView(ViewHandlerImpl.java:182)
at com.sun.rave.web.ui.appbase.faces.ViewHandlerImpl.renderView(ViewHandlerImpl.java:285)
at com.sun.faces.lifecycle.RenderResponsePhase.execute(RenderResponsePhase.java:133)
at com.sun.faces.lifecycle.LifecycleImpl.phase(LifecycleImpl.java:244)
at com.sun.faces.lifecycle.LifecycleImpl.render(LifecycleImpl.java:140)
at javax.faces.webapp.FacesServlet.service(FacesServlet.java:245)
at org.apache.catalina.core.ApplicationFilterChain.servletService(ApplicationFilterChain.java:397)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:303)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:184)
at com.sun.webui.jsf.util.UploadFilter.doFilter(UploadFilter.java:240)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:216)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:184)
at org.netbeans.modules.web.monitor.server.MonitorFilter.doFilter(MonitorFilter.java:368)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:216)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:184)
at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:276)
at org.apache.catalina.core.StandardPipeline.doInvoke(StandardPipeline.java:566)
at org.apache.catalina.core.StandardPipeline.invoke(StandardPipeline.java:536)
at org.apache.catalina.core.StandardContextValve.invokeInternal(StandardContextValve.java:240)
at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:179)
at org.apache.catalina.core.StandardPipeline.doInvoke(StandardPipeline.java:566)
at com.sun.enterprise.web.WebPipeline.invoke(WebPipeline.java:73)
at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:182)
at org.apache.catalina.core.StandardPipeline.doInvoke(StandardPipeline.java:566)
at com.sun.enterprise.web.VirtualServerPipeline.invoke(VirtualServerPipeline.java:120)
at org.apache.catalina.core.ContainerBase.invoke(ContainerBase.java:939)
at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:137)
at org.apache.catalina.core.StandardPipeline.doInvoke(StandardPipeline.java:566)
at org.apache.catalina.core.StandardPipeline.invoke(StandardPipeline.java:536)
at org.apache.catalina.core.ContainerBase.invoke(ContainerBase.java:939)
at org.apache.coyote.tomcat5.CoyoteAdapter.service(CoyoteAdapter.java:239)
at com.sun.enterprise.web.connector.grizzly.ProcessorTask.invokeAdapter(ProcessorTask.java:667)
at com.sun.enterprise.web.connector.grizzly.ProcessorTask.processNonBlocked(ProcessorTask.java:574)
at com.sun.enterprise.web.connector.grizzly.ProcessorTask.process(ProcessorTask.java:844)
at com.sun.enterprise.web.connector.grizzly.ReadTask.executeProcessorTask(ReadTask.java:287)
at com.sun.enterprise.web.connector.grizzly.ReadTask.doTask(ReadTask.java:212)
at com.sun.enterprise.web.connector.grizzly.TaskBase.run(TaskBase.java:252)
at com.sun.enterprise.web.connector.grizzly.WorkerThread.run(WorkerThread.java:75)
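The failure mode in this trace, executing a prepared statement while parameter 1 is still unbound, is easy to reproduce in any parameterized API. A sketch with Python's sqlite3 and an invented Trip_Tab row (sqlite3 raises ProgrammingError where Connector/J says "No value specified for parameter 1"):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Trip_Tab (Trip_Date TEXT, User_ID INT, Destination TEXT)")
conn.execute("INSERT INTO Trip_Tab VALUES ('2007-11-22', 7, 'Oulu')")

query = "SELECT Trip_Date, Destination FROM Trip_Tab WHERE User_ID = ?"

# Executing with the placeholder unbound fails, just like the JDBC error.
try:
    conn.execute(query).fetchall()
    failed = False
except sqlite3.ProgrammingError:
    failed = True

# Binding the value first (what setObject(1, ...) does in the JDBC code) works.
rows = conn.execute(query, (7,)).fetchall()
print(failed, rows)  # True [('2007-11-22', 'Oulu')]
```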
Also, when I tried to update my MySQL Connector/J driver to version 5.1.5 from 5.0.5 (NB 5.5.1) and 5.0.7 (NB 6.1), I could not get it to work (a very long search for some JDBC classes, with no response in the end) on both my NetBeans 5.5.1 (on PC) and NetBeans 6.1 (on laptop) IDEs.
Could anybody look into this issue?
Many thanks
Edited by: linqing on Nov 22, 2007 4:48 AM -
Writing the file using Write to SGL and reading the data using Read from SGL
Hello Sir, I have a problem using the Write to SGL VI. When I try to write the data captured with the DAQ board to an SGL file, I am unable to store the data as desired. There might be some problem with the VI I am using to write the data to the SGL file, but I am not able to figure out what it is. I am attaching a zip file which contains five files.
1) Acquire_Current_Binary_Exp.vi -> This is the VI which I used to store my data using Write to SGL file.
2) Retrive_BINARY_Data.vi -> This is the VI which I used to Read from SGL file and plot it
3) Binary_Capture -> This is the data captured using (1), which can be plotted using (2); what I observed is that the plot is different and also the time scale is not as expected.
4) Unexpected_Graph.png is the unexpected graph when I am using Write to SGL and Read from SGL to store and retrieve the data.
5) Expected_Graph.png -> This is the expected data format I supposed to get. I have obtained this plot when I have used write to LVM and read from LVM file to store and retrieve the data.
I tried a lot of modifications to the sub-VIs, but it doesn't work for me. I think I am making some mistake while writing the data to SGL and reading it back. Also, I don't know why my graph is not like (5); instead I am getting something like (4). It's totally different. You can also observe the difference between the time scales of (4) and (5).
Attachments:
Krishna_Files.zip 552 KB
The binary data file has no time axis information; it is pure y data. Only the LVM file contains information about t(0) and dt. Since you throw away this information before saving to the binary file, it cannot be retrieved.
Did you try wiring a 2 as suggested?
(see also http://forums.ni.com/ni/board/message?board.id=BreakPoint&message.id=925 )
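The point about the binary file being pure y data can be seen with any raw float32 round trip: the sample values survive, the timing does not. A sketch in Python (waveform values invented; big-endian chosen only because that is LabVIEW's default byte order):

```python
import io
import struct

dt, t0 = 0.001, 0.0          # timing info lives only in these variables
samples = [0.0, 0.5, 1.0, 0.5, 0.0]

# Write raw SGL (32-bit float) data, the way Write to SGL does.
buf = io.BytesIO()
buf.write(struct.pack(">%df" % len(samples), *samples))

# Read it back: the y-values survive, but nothing in the file says
# where t0 was or what dt is; that metadata must be saved separately.
buf.seek(0)
raw = buf.read()
restored = list(struct.unpack(">%df" % (len(raw) // 4), raw))
print(restored)  # [0.0, 0.5, 1.0, 0.5, 0.0]
```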
Message Edited by altenbach on 07-29-2005 11:35 PM
LabVIEW Champion. Do more with less code and in less time.
Attachments:
Retrive_BINARY_DataMOD2.vi 1982 KB -
Create Query using data from 0IC_C03 to show stock balance
Dear All,
I have successfully set up 0IC_C03. I have loaded in the data in accordance with the documentation. I checked the result and it seems the stock balance matches the data source.
However, my question is how I can set up a formula in the query so I can look up the stock balance based on the date I enter when running the query.
For example, I loaded in the data on Oct 13, 2008. For material A, I have a key figure called Quantity Total Stock, which is the net of receipts and issues; the total stock is 10,000 LB as of Oct 13, 2008.
When I run the same query using the date Oct 31, 2005, the quantity total stock shows -10,000 LB. The stock balance of the material should be 0 LB on Oct 31, 2005. I know I can't simply create two key figures - one using the loading date (Oct 13, 2008) and one using the input variable date - and add them together, since I will have trouble if I enter Oct 13, 2008 as the input variable day, because it will zero out.
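The balance being described follows from simple arithmetic: the stock on a given date is the marker stock minus all movements posted after that date. A toy sketch of that arithmetic (all numbers and dates invented to echo the thread; this illustrates the idea of a non-cumulative key figure with a reference point, not actual BW logic):

```python
from datetime import date

# Movements: (posting date, quantity); receipts positive, issues negative.
movements = [
    (date(2008, 9, 1), 4000),
    (date(2008, 9, 20), 5600),
]
marker_date = date(2008, 10, 13)
marker_stock = 9600   # stock known at the marker date

def stock_on(day):
    # Roll the marker back by removing movements posted after `day`.
    return marker_stock - sum(q for d, q in movements if d > day)

print(stock_on(date(2008, 10, 13)))  # 9600
print(stock_on(date(2008, 8, 31)))   # 0
```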
Please advise. Thanks.
Hi Brian,
Thank you for your reply. After I double-checked, the setting was wrong in the DTP of 2LIS_03_BX feeding the InfoCube 0IC_C03; the extraction mode wasn't set to initial non-cumulative for non-cumulative values. After I changed that and re-ran the whole process, the current and past stock balances are correct.
However, the problem happened again when I tested delta update. I loaded the 2LIS_03_BF delta update first to 0IC_C03 and compressed with "no marker update" checked. After that, I loaded the 2LIS_03_UM delta update and compressed with "no marker update" checked. For some reason, the quantity total stock doesn't add up: it should be 9600 LB in total quantity stock but shows 0 LB as of Oct 08. When I ran the query for Oct 06, the quantity total stock showed -9600 LB. Do you know why? Thanks again. -
Query using 0calmonth, to input the month range, data incorrect
Hi, experts,
I have a query using 0CALMONTH in the filter, restricted to a range of values (e.g. 2009.01-2009.08),
but the data in BEx Analyzer is only for 2009.08; it does not collect the cube data from 2009.01 to 2009.08.
If I restrict it to a single value (e.g. 2009.02), the data is correct.
I tried to restrict the key figure by 0CALMONTH and then tried using variables with an offset but it does not work as input is a month range.
My BW is on support package 13, Basis SP12.
thanks in advance,
xwu.
I solved this issue myself; there is a non-cumulative key figure involved.
thanks,
xwu. -
I am not able to set up my Personal Hotspot. When I try, the message "To enable Personal Hotspot for this account, contact carrier" is displayed. I am in Oman and using the Nawras service for my data plan. Please help me. I was using this service before but am now facing this problem.
Md Asad wrote:
Yes but they told mobile co mean Device 'iPhone co'
Sorry but that makes no sense in English. Only your mobile phone company (i.e. "carrier") can enable the Personal Hotspot feature. -
After installing an app on iPhone 5s, I can't use it outside of wifi (using cellular or mobile data) and the app is not listed in settings to enable its use of mobile data - any clues on what is wrong here?
Steps to take to resolve this issue:
1/ Turn off Wifi
2/ Ensure Mobile (cellular) data is on
3/ Turn iPhone OFF and then ON (re-boot iPhone)
4/ Without turning Wifi on, open an app that requires mobile (cellular) data.
5/ The app will now show on the Settings list with the option to enable or disable mobile (cellular) data
6/ In time all apps that require or use mobile data will be added to the list automatically.