Issue with a date gap between two records.
Hi Friends,
I have a table with two columns, EFFECTIVE_START_DATE and EFFECTIVE_END_DATE.
The problem is that I have several rows per employee, and two consecutive records can have a gap between them that cannot be traced.
E_CODE EFFECTIVE_START_DATE EFFECTIVE_END_DATE PREVIOUS_COE
10620 12-Jul-10 CORPORATE
10620 17-Sep-09 30-Nov-09 CORPORATE
10620 19-Nov-07 31-Jan-08 CORPORATE
I want to create a view that returns all the records exactly as in the table above, except for EFFECTIVE_START_DATE and EFFECTIVE_END_DATE. Sample records are shown in the table below.
The catch is that I don't actually have the records shown below in the table, but the view has to produce them anyway. Each EFFECTIVE_END_DATE must be the next record's EFFECTIVE_START_DATE minus 1 day.
E_CODE EFFECTIVE_START_DATE EFFECTIVE_END_DATE PREVIOUS_COE
10620 1-Dec-09 CORPORATE
10620 1-Feb-08 30-Nov-09 CORPORATE
10620 19-Nov-07 31-Jan-08 CORPORATE
Please let me know whether this approach is possible, or whether I have to find another way.
Thanks
Bachan.
hi Giri,
You didn't get my point.
This is the record i am having in a table
E_CODE EFFECTIVE_START_DATE EFFECTIVE_END_DATE PREVIOUS_COE
Row1 10620 19-Nov-07? No: Row1 10620 12-Jul-10 (null) CORPORATE
Row2 10620 17-Sep-09 30-Nov-09 CORPORATE
Row3 10620 19-Nov-07 31-Jan-08 CORPORATE
If you look at row 3, the start date is 19-Nov-07 and the end date is 31-Jan-08.
Moving on to row 2, the start date is 17-Sep-09, but it should be 01-Feb-08, the day after row 3's end date.
Likewise, if the end date in row 2 is 30-Nov-09, then the start date in row 1 should be 01-Dec-09.
That is why I requested a view that returns the data this way:
E_CODE EFFECTIVE_START_DATE EFFECTIVE_END_DATE PREVIOUS_COE
10620 1-Dec-09 (null) CORPORATE
10620 1-Feb-08 30-Nov-09 CORPORATE
10620 19-Nov-07 31-Jan-08 CORPORATE
I hope you understand now.
Thanks
Bachan
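A view like this can typically be built with Oracle's LAG analytic function: order the rows by start date and replace each start date with the previous row's end date plus one day (something like LAG(effective_end_date) OVER (PARTITION BY e_code ORDER BY effective_start_date) + 1). The sketch below shows the same logic outside the database; it is a Python stand-in with a hypothetical record layout, not Bachan's actual schema:

```python
from datetime import date, timedelta

def fix_start_dates(rows):
    """Given one employee's rows sorted by EFFECTIVE_START_DATE ascending,
    force each row's start date to be the previous row's end date + 1 day.
    Each row is (start, end, previous_coe); end may be None (open-ended)."""
    fixed = []
    prev_end = None
    for start, end, coe in rows:
        if prev_end is not None:
            start = prev_end + timedelta(days=1)  # day after previous end
        fixed.append((start, end, coe))
        prev_end = end
    return fixed

# The sample records from the thread, oldest first
rows = [
    (date(2007, 11, 19), date(2008, 1, 31), "CORPORATE"),
    (date(2009, 9, 17),  date(2009, 11, 30), "CORPORATE"),
    (date(2010, 7, 12),  None, "CORPORATE"),
]
print(fix_start_dates(rows))
```

The oldest row per employee keeps its original start date, which is what wrapping the LAG expression in NVL would do in the view itself.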
Similar Messages
-
Inventory Issue: Movement type 905 creating two records in BI.
Hello Experts,
We have a problem in the Inventory scenario implemented in our system. After completion of the month's load into the snapshot cube, we are getting a quantity mismatch between the BI and ECC values. I found that records with movement type 905 are causing the problem.
We have only one record in ECC, while two records exist in BI: one has the reversal indicator set and the other does not. The quantity is the same in both cases. The record with the reversal indicator has process key 100, while the record without it has no process key at all. In the further update from ODS ZMMDS104 to ICSNAP1, the update rule adds or subtracts on the basis of the process key, as per the standard Inventory scenario implementation.
So am I right that this movement type is creating the problem?
Has anyone faced this issue before?
If so, how do I resolve it?
Please advise as soon as possible, as this has become a high-priority issue for the client.
Thanks in Advance
Regards,
Yash Raj.
Hi Aduri,
Thanks for the quick reply.
I am new to the inventory field; can you please guide me on how to check your advice in the system? You
said "Recheck the Reversal Indicated Qty because if it has reverse indicator it should come from ECC or from ODS Check the Same at Change Log Table to trace the Changes."
What do you mean by "or from ODS"? Do you mean that the ODS could create this entry? Also, this ODS ZMMDS104 is loaded daily by delta from 2LIS_03_BF, but this record relates to period 12/2008 and we are now in period 1/2009 (I am reconciling the data for the last period), so I guess there is no chance of finding this entry in the change log.
Also "Try to see at DS field for the Process Key/Movement type with Reverse Indicator At ECC. "
How do I check the same in the data source?
Also "If you don't have the Reverse postings at ECC without process Key then you can report on the same to the Functional Team/Client. But before that make sure that you have reverse postings without process key assigned to it."
I checked in the ECC system (RSA3): only one record exists for that material document; it has a positive quantity and material movement type 905 with movement indicator L.
Waiting for your response.
Thanks once again
Regards,
Yash Raj -
11g Cube not showing any data with no Rejected records
Hi David ,
Strangely, one of my 11g cubes is not showing data as of today: the build reports all records rejected, yet when I look in the rejected-records table I don't find any rows. Not sure what is happening. When I take the queries AWM logs in CUBE_BUILD_LOG and run them directly against the database in the AWM schema, the records come back perfectly fine. So the same query fires during the cube load but returns no data? My cube build script has only LOAD and AGGREGATE.
After maintaining my dimensions, the data looks fine, but no data is populated after cube maintenance. MV is switched off across all dimensions and cubes.
I looked at CUBE_OPERATION_LOG but cannot make sense of its contents.
Any advice ?
Thanks and Regards,
DxP
Hi David,
To be very frank, today is a very bad day... Please see my observations below:
I executed the queries below to make sure that no key value present in the fact is missing from a dimension. All of the queries below return no rows.
select distinct owner_postn_wid from w_synm_rx_t_f
minus
select distinct row_wid from postn_dh
select distinct payer_type_Wid from w_synm_rx_t_f
minus
select distinct row_wid from wc_ins_plan_dh
select distinct market_wid from w_synm_rx_t_f
minus
select distinct row_wid from w_product_dh
select distinct period_day_wid from w_synm_rx_t_f
minus
select distinct row_wid from w_daytime_D
select distinct contact_wid from w_synm_rx_t_f
intersect
select distinct row_wid from w_person_d
select distinct X_TERR_TYPE_WID from w_synm_rx_t_f
minus
select distinct row_wid from W_LOV_D
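Those MINUS queries are set differences between the fact table's foreign keys and the dimension keys. The same check can be sketched client-side in Python; the sample key values below are made up for illustration, not taken from the poster's data:

```python
def missing_keys(fact_keys, dim_keys):
    """Return fact foreign-key values with no matching dimension row,
    i.e. the same check as `SELECT ... MINUS SELECT ...` in Oracle."""
    return set(fact_keys) - set(dim_keys)

# Hypothetical sample values for PERIOD_DAY_WID vs W_DAYTIME_D.ROW_WID
fact_period_wids = [20100101, 20100102, 20100515]
dim_row_wids = [20100101, 20100102]
print(missing_keys(fact_period_wids, dim_row_wids))  # keys that would be orphaned
```

An empty result for every dimension is what the poster observed, which is why the rejections are so puzzling.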
============================
Each of the queries below returns a count of 0 rows, ensuring no NULL keys are present:
select count(1) from w_synm_rx_t_f where contact_wid is null;
select count(1) from w_synm_rx_t_f where owner_postn_wid is null;
select count(1) from w_synm_rx_t_f where payer_type_Wid is null;
select count(1) from w_synm_rx_t_f where period_day_wid is null;
select count(1) from w_synm_rx_t_f where X_TERR_TYPE_WID is null;
select count(1) from w_synm_rx_t_f where market_wid is null;
+++++++++++++++++++++++++++++++++
Cube Build Log has below entry:
796 0 STARTED CLEAR VALUES MKT_SLS_CUBE CUBE NNOLAP NN_OLAP_POC P14:JAN2010 17-AUG-11 07.12.08.267000000 PM +05:30 JAVA 1 C 47141 67 0 1
796 0 COMPLETED CLEAR VALUES MKT_SLS_CUBE CUBE NNOLAP NN_OLAP_POC P14:JAN2010 17-AUG-11 07.12.08.267000000 PM +05:30 JAVA 1 C 47141 67 0 2
796 0 STARTED LOAD MKT_SLS_CUBE CUBE NNOLAP NN_OLAP_POC P14:JAN2010 17-AUG-11 07.12.08.283000000 PM +05:30 JAVA 1 C 47142 68 0 1
796 0 SQL LOAD MKT_SLS_CUBE CUBE "<SQL>
<![CDATA[
SELECT /*+ bypass_recursive_check cursor_sharing_exact no_expand no_rewrite */
T16_ROW_WID ALIAS_127,
T13_ROW_WID ALIAS_128,
T10_ROW_WID ALIAS_129,
T7_ROW_WID ALIAS_130,
T4_ROW_WID ALIAS_131,
T1_ROW_WID ALIAS_132,
SUM(T20_MKT_TRX) ALIAS_133,
SUM(T20_MKT_NRX) ALIAS_134
FROM
( SELECT /*+ no_rewrite */
T1."CONTACT_WID" T20_CONTACT_WID,
T1."MARKET_WID" T20_MARKET_WID,
T1."OWNER_POSTN_WID" T20_OWNER_POSTN_WID,
T1."PAYER_TYPE_WID" T20_PAYER_TYPE_WID,
T1."PERIOD_DAY_WID" T20_PERIOD_DAY_WID,
T1."MKT_NRX" T20_MKT_NRX,
T1."MKT_TRX" T20_MKT_TRX,
T1."X_TERR_TYPE_WID" T20_X_TERR_TYPE_WID
FROM
NN_OLAP_POC."W_SYNM_RX_T_F" T1 )
T20,
( SELECT /*+ no_rewrite */
T1."ROW_WID" T16_ROW_WID
FROM
NN_OLAP_POC."W_DAYTIME_D" T1 )
T16,
( SELECT /*+ no_rewrite */
T1."ROW_WID" T13_ROW_WID
FROM
NN_OLAP_POC."W_PERSON_D" T1 )
T13,
( SELECT /*+ no_rewrite */
T1."ROW_WID" T10_ROW_WID
FROM
NN_OLAP_POC."WC_INS_PLAN_DH" T1 )
T10,
( SELECT /*+ no_rewrite */
T1."ROW_WID" T7_ROW_WID
FROM
NN_OLAP_POC."W_LOV_D" T1 )
T7,
( SELECT /*+ no_rewrite */
T1."ROW_WID" T4_ROW_WID
FROM
NN_OLAP_POC."POSTN_DH" T1 )
T4,
( SELECT /*+ no_rewrite */
T1."ROW_WID" T1_ROW_WID
FROM
NN_OLAP_POC."W_PRODUCT_DH" T1 )
T1
WHERE
((T20_PERIOD_DAY_WID = T16_ROW_WID)
AND (T16_ROW_WID = 20100101)
AND (T20_CONTACT_WID = T13_ROW_WID)
AND (T20_PAYER_TYPE_WID = T10_ROW_WID)
AND (T7_ROW_WID = T20_X_TERR_TYPE_WID)
AND (T20_OWNER_POSTN_WID = T4_ROW_WID)
AND (T20_MARKET_WID = T1_ROW_WID)
AND ((T20_PERIOD_DAY_WID) IN ((20100107.000000) , (20100106.000000) , (20100128.000000) , (20100124.000000) , (20100121.000000) , (20100118.000000) , (20100115.000000) , (20100109.000000) , (20100125.000000) , (20100114.000000) , (20100111.000000) , (20100110.000000) , (20100104.000000) , (20100101.000000) , (20100129.000000) , (20100123.000000) , (20100117.000000) , (20100113.000000) , (20100108.000000) , (20100131.000000) , (20100120.000000) , (20100116.000000) , (20100119.000000) , (20100105.000000) , (20100102.000000) ,
(20100130.000000) , (20100127.000000) , (20100122.000000) , (20100112.000000) , (20100103.000000) , (20100126.000000) ) ) )
GROUP BY
(T1_ROW_WID, T4_ROW_WID, T7_ROW_WID, T10_ROW_WID, T13_ROW_WID, T16_ROW_WID)
ORDER BY
T1_ROW_WID ASC NULLS LAST ,
T4_ROW_WID ASC NULLS LAST ,
T7_ROW_WID ASC NULLS LAST ,
T10_ROW_WID ASC NULLS LAST ,
T13_ROW_WID ASC NULLS LAST ,
T16_ROW_WID ASC NULLS LAST ]]>
</SQL>" NNOLAP NN_OLAP_POC P14:JAN2010 17-AUG-11 07.12.08.627000000 PM +05:30 JAVA 1 MAP1 C 47142 68 0 2
796 0 COMPLETED LOAD MKT_SLS_CUBE CUBE "<CubeLoad
LOADED="0"
REJECTED="4148617"/>" NNOLAP NN_OLAP_POC P14:JAN2010 17-AUG-11 07.12.40.486000000 PM +05:30 JAVA 1 C 47142 68 0 3
796 0 STARTED UPDATE MKT_SLS_CUBE CUBE NNOLAP NN_OLAP_POC P14:JAN2010 17-AUG-11 07.12.40.501000000 PM +05:30 JAVA 1 C 47143 69 0 1
796 0 COMPLETED UPDATE MKT_SLS_CUBE CUBE NNOLAP NN_OLAP_POC P14:JAN2010 17-AUG-11 07.12.40.548000000 PM +05:30 JAVA 1 C 47143 69 0 2
+++++++++++++++++
You can observe a clear rejection of 4 million rows. I ran the above query directly and it returns my data successfully.
Looking at the CUBE_REJECTED records, I took a sample record and put its key values into the above query; it returns the data fine, with my measures and dimension WIDs (see the ROW_WID filters below):
=========================
SELECT /*+ bypass_recursive_check cursor_sharing_exact no_expand no_rewrite */
T16_ROW_WID ALIAS_127,
T13_ROW_WID ALIAS_128,
T10_ROW_WID ALIAS_129,
T7_ROW_WID ALIAS_130,
T4_ROW_WID ALIAS_131,
T1_ROW_WID ALIAS_132,
SUM(T20_MKT_TRX) ALIAS_133,
SUM(T20_MKT_NRX) ALIAS_134
FROM
( SELECT /*+ no_rewrite */
T1."CONTACT_WID" T20_CONTACT_WID,
T1."MARKET_WID" T20_MARKET_WID,
T1."OWNER_POSTN_WID" T20_OWNER_POSTN_WID,
T1."PAYER_TYPE_WID" T20_PAYER_TYPE_WID,
T1."PERIOD_DAY_WID" T20_PERIOD_DAY_WID,
T1."MKT_NRX" T20_MKT_NRX,
T1."MKT_TRX" T20_MKT_TRX,
T1."X_TERR_TYPE_WID" T20_X_TERR_TYPE_WID
FROM
NN_OLAP_POC."W_SYNM_RX_T_F" T1 )
T20,
( SELECT /*+ no_rewrite */
T1."ROW_WID" T16_ROW_WID
FROM
NN_OLAP_POC."W_DAYTIME_D" T1 )
T16,
( SELECT /*+ no_rewrite */
T1."ROW_WID" T13_ROW_WID
FROM
NN_OLAP_POC."W_PERSON_D" T1 )
T13,
( SELECT /*+ no_rewrite */
T1."ROW_WID" T10_ROW_WID
FROM
NN_OLAP_POC."WC_INS_PLAN_DH" T1 )
T10,
( SELECT /*+ no_rewrite */
T1."ROW_WID" T7_ROW_WID
FROM
NN_OLAP_POC."W_LOV_D" T1 )
T7,
( SELECT /*+ no_rewrite */
T1."ROW_WID" T4_ROW_WID
FROM
NN_OLAP_POC."POSTN_DH" T1 )
T4,
( SELECT /*+ no_rewrite */
T1."ROW_WID" T1_ROW_WID
FROM
NN_OLAP_POC."W_PRODUCT_DH" T1 )
T1
WHERE
((T20_PERIOD_DAY_WID = T16_ROW_WID)
AND (T16_ROW_WID = 20100101)
AND (T20_CONTACT_WID = T13_ROW_WID)
AND (T20_PAYER_TYPE_WID = T10_ROW_WID)
AND (T7_ROW_WID = T20_X_TERR_TYPE_WID)
AND (T20_OWNER_POSTN_WID = T4_ROW_WID)
AND (T20_MARKET_WID = T1_ROW_WID)
AND T13_ROW_WID = 255811
AND T7_ROW_WID = 122
AND T4_ROW_WID =3
AND T1_ROW_WID=230
AND T10_ROW_WID = 26
AND ((T20_PERIOD_DAY_WID) IN ((20100107.000000) , (20100106.000000) , (20100128.000000) , (20100124.000000) , (20100121.000000) , (20100118.000000) , (20100115.000000) , (20100109.000000) , (20100125.000000) , (20100114.000000) , (20100111.000000) , (20100110.000000) , (20100104.000000) , (20100101.000000) , (20100129.000000) , (20100123.000000) , (20100117.000000) , (20100113.000000) , (20100108.000000) , (20100131.000000) , (20100120.000000) , (20100116.000000) , (20100119.000000) , (20100105.000000) , (20100102.000000) ,
(20100130.000000) , (20100127.000000) , (20100122.000000) , (20100112.000000) , (20100103.000000) , (20100126.000000) ) ) )
GROUP BY
(T1_ROW_WID, T4_ROW_WID, T7_ROW_WID, T10_ROW_WID, T13_ROW_WID, T16_ROW_WID)
ORDER BY
T1_ROW_WID ASC NULLS LAST ,
T4_ROW_WID ASC NULLS LAST ,
T7_ROW_WID ASC NULLS LAST ,
T10_ROW_WID ASC NULLS LAST ,
T13_ROW_WID ASC NULLS LAST ,
T16_ROW_WID ASC NULLS LAST
=================================
The XML export of the cube is below:
<!DOCTYPE Metadata [
<!ENTITY % BIND_VALUES PUBLIC "OLAP BIND VALUES" "OLAP METADATA">
%BIND_VALUES;
]>
<Metadata
Version="1.2"
MinimumDatabaseVersion="11.2.0.1">
<Cube
ETViewName="MKT_SLS_CUBE_VIEW"
Name="MKT_SLS_CUBE">
<Measure>
<BaseMeasure
SQLDataType="NUMBER"
ETMeasureColumnName="TRX"
Name="TRX">
<Description
Type="LongDescription"
Language="AMERICAN"
Value="TRX">
</Description>
<Description
Type="ShortDescription"
Language="AMERICAN"
Value="TRX">
</Description>
<Description
Type="Description"
Language="AMERICAN"
Value="TRX">
</Description>
</BaseMeasure>
</Measure>
<Measure>
<BaseMeasure
SQLDataType="NUMBER"
ETMeasureColumnName="NRX"
Name="NRX">
<Description
Type="LongDescription"
Language="AMERICAN"
Value="NRX">
</Description>
<Description
Type="ShortDescription"
Language="AMERICAN"
Value="NRX">
</Description>
<Description
Type="Description"
Language="AMERICAN"
Value="NRX">
</Description>
</BaseMeasure>
</Measure>
<CubeMap
Name="MAP1"
IsSolved="False"
Query="W_SYNM_RX_T_F"
WhereClause="W_DAYTIME_D.ROW_WID = 20100101">
<MeasureMap
Name="TRX"
Measure="TRX"
Expression="W_SYNM_RX_T_F.MKT_TRX">
</MeasureMap>
<MeasureMap
Name="NRX"
Measure="NRX"
Expression="W_SYNM_RX_T_F.MKT_NRX">
</MeasureMap>
<CubeDimensionalityMap
Name="TIME"
Dimensionality="TIME"
MappedDimension="TIME.CALENDER.MONTHLY"
JoinCondition="W_SYNM_RX_T_F.PERIOD_DAY_WID = W_DAYTIME_D.ROW_WID"
Expression="W_DAYTIME_D.ROW_WID">
</CubeDimensionalityMap>
<CubeDimensionalityMap
Name="CUSTOMER"
Dimensionality="CUSTOMER"
MappedDimension="CUSTOMER.CUSTOMER_HIERARCHY.DETAIL_LEVEL"
JoinCondition="W_SYNM_RX_T_F.CONTACT_WID = W_PERSON_D.ROW_WID"
Expression="W_PERSON_D.ROW_WID">
</CubeDimensionalityMap>
<CubeDimensionalityMap
Name="INS_PLAN_DH"
Dimensionality="INS_PLAN_DH"
MappedDimension="INS_PLAN_DH.INS_PLAN.DETAIL"
JoinCondition="W_SYNM_RX_T_F.PAYER_TYPE_WID = WC_INS_PLAN_DH.ROW_WID"
Expression="WC_INS_PLAN_DH.ROW_WID">
</CubeDimensionalityMap>
<CubeDimensionalityMap
Name="LIST_OF_VALUES"
Dimensionality="LIST_OF_VALUES"
MappedDimension="LIST_OF_VALUES.LOV_HIERARCHY.DETAIL_LEVEL"
JoinCondition="W_LOV_D.ROW_WID = W_SYNM_RX_T_F.X_TERR_TYPE_WID"
Expression="W_LOV_D.ROW_WID">
</CubeDimensionalityMap>
<CubeDimensionalityMap
Name="POSITIONDH"
Dimensionality="POSITIONDH"
MappedDimension="POSITIONDH.POST_HIER.DETAIL"
JoinCondition="W_SYNM_RX_T_F.OWNER_POSTN_WID = POSTN_DH.ROW_WID"
Expression="POSTN_DH.ROW_WID">
</CubeDimensionalityMap>
<CubeDimensionalityMap
Name="PRODH"
Dimensionality="PRODH"
MappedDimension="PRODH.PRODHIER.DETAILLVL"
JoinCondition="W_SYNM_RX_T_F.MARKET_WID = W_PRODUCT_DH.ROW_WID"
Expression="W_PRODUCT_DH.ROW_WID">
</CubeDimensionalityMap>
</CubeMap>
<Organization>
<AWCubeOrganization
MVOption="NONE"
SparseType="COMPRESSED"
MeasureStorage="SHARED"
NullStorage="MV_READY"
CubeStorageType="NUMBER"
PrecomputePercent="35"
PrecomputePercentTop="0"
PartitionLevel="TIME.CALENDER.MONTHLY"
AW="&AW_NAME;">
<SparseDimension
Name="TIME"/>
<SparseDimension
Name="CUSTOMER"/>
<SparseDimension
Name="INS_PLAN_DH"/>
<SparseDimension
Name="LIST_OF_VALUES"/>
<SparseDimension
Name="POSITIONDH"/>
<SparseDimension
Name="PRODH"/>
<DefaultBuild>
<![CDATA[BUILD SPEC LOAD_AND_AGGREGATE (
LOAD NO SYNCH,
SOLVE
)]]>
</DefaultBuild>
</AWCubeOrganization>
</Organization>
<Dimensionality
Name="TIME"
ETKeyColumnName="TIME"
Dimension="TIME">
</Dimensionality>
<Dimensionality
Name="CUSTOMER"
ETKeyColumnName="CUSTOMER"
Dimension="CUSTOMER">
</Dimensionality>
<Dimensionality
Name="INS_PLAN_DH"
ETKeyColumnName="INS_PLAN_DH"
Dimension="INS_PLAN_DH">
</Dimensionality>
<Dimensionality
Name="LIST_OF_VALUES"
ETKeyColumnName="LIST_OF_VALUES"
Dimension="LIST_OF_VALUES">
</Dimensionality>
<Dimensionality
Name="POSITIONDH"
ETKeyColumnName="POSITIONDH"
Dimension="POSITIONDH">
</Dimensionality>
<Dimensionality
Name="PRODH"
ETKeyColumnName="PRODH"
Dimension="PRODH">
</Dimensionality>
<Description
Type="LongDescription"
Language="AMERICAN"
Value="MKT SLS CUBE">
</Description>
<Description
Type="ShortDescription"
Language="AMERICAN"
Value="MKT SLS CUBE">
</Description>
<Description
Type="Description"
Language="AMERICAN"
Value="MKT SLS CUBE">
</Description>
<ConsistentSolve>
<![CDATA[SOLVE (
SUM
MAINTAIN COUNT
OVER ALL
)]]>
</ConsistentSolve>
</Cube>
</Metadata>
+++++++++++++++++++++++
I dropped the AW, created a new one from the exported XML, maintained all dimensions, and then rebuilt. I still have the issue :(
Is there anything you can highlight from the above?
Thanks,
DxP
I also suspect whether this is related to the error below, raised when I click on one of my Position_Hier views from AWM: even if I just select that view, SQL Developer throws the error after displaying the first couple of rows (while paging down).
java.sql.SQLException: ORA-33674: Data block size 63 exceeds the maximum size of 60 bytes.
at oracle.olap.awm.util.jdbc.SQLWrapper.execute(Unknown Source)
at oracle.olap.awm.querydialog.PagedQueryDialog$1.construct(Unknown Source)
at oracle.olap.awm.ui.SwingWorker$2.run(Unknown Source)
at java.lang.Thread.run(Thread.java:595)
Edited by: e_**** on Aug 17, 2011 8:41 PM -
How can I read and display data with OR without recording at the same time?
Hi all,
This forum has been of much help to me, but I have a rather specific question I can't seem to find an answer to. I am new to LabView, educating myself with tutorials, trial and error, and this forum.
Attached is what I have so far. I would like to read data (three voltages) and write it to an Excel file. As written, it does nothing until I click the "Record" button; it then takes data, displays it on the gauges and waveform chart, and writes it to a file. What I would like is to display the data at all times, and to record only when I click the Record button (while still displaying the data being read). I have tried moving the DAQ Assistant VI, gauges, and waveform graph outside the loops, but then it reads until I press Record, after which it simply records whatever value it is stuck on. Sorry for the long post; thanks for any help.
--Nathan
Attachments:
Record and Write to File 2.vi 332 KB
Sorry, I don't have any DAQ installed, so I won't comment on the DAQ parts.
First of all, your VI will do nothing until you (1) first set the Record button to ON and (2) press the Start button in the toolbar. It almost seems that you are using the "continuous run" button. (Don't! That's not its purpose; basically, it automatically restarts the program whenever it finishes.) A top-level VI should never stop during normal operation, so place a big while loop around all the code and create a wait state that does nothing until you tell it to acquire.
You don't need the inner while loop; simply place your code inside the big while loop (and accumulate your array data in an initialized shift register, if really needed; currently you do this at the inner loop boundary for some reason). You have a big problem in that the array can grow without bounds unless you clear the accumulated data once in a while or place it in a FIFO buffer of limited size. If you let it grow forever, as you do now, the program might run out of resources at some point in the future.
Since you are appending to your file, maybe you don't need any shift register; just use the data points of the current iteration. Place your save operations in a case structure and keep appending the new data to your file only while the case is true.
It is also very inefficient to use high-level file I/O here, because each instance opens and closes the file. I recommend opening the file once outside the loop, then using low-level I/O to append, and closing the file once the big loop is done.
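The limited-size FIFO buffer suggested above is the key to keeping memory bounded in a long-running acquisition loop. A minimal Python sketch of the same pattern (LabVIEW itself would use a shift register or a lossy queue; the sample values here are stand-ins):

```python
from collections import deque

BUFFER_SIZE = 1000  # keep only the most recent 1000 samples

samples = deque(maxlen=BUFFER_SIZE)  # oldest samples drop off automatically

for i in range(5000):   # stand-in for the acquisition loop
    samples.append(i)   # stand-in for one voltage reading

print(len(samples))     # stays capped at BUFFER_SIZE
```

However long the loop runs, the buffer never grows past BUFFER_SIZE, so the program cannot exhaust memory the way an unbounded array build does.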
LabVIEW Champion . Do more with less code and in less time . -
Issue with Merging two files in BPM
Hi,
I need to merge two files (balance and transaction); the correlation is defined on ID, Date, and Account number.
Sometimes, when there are no transaction records for an account, the balance file row carries a transaction count of "0" in its last field.
Balance file:
MDk;1728;175;02.09.11;781961.09;0.00;0.00;781961.09;;;;;;;;;0
MDk;8574;175;02.09.11;4462;1112;104098800;104102150;;;;;;;;;2
From the above file, two accounts:
MDk;1728; --- with zero transaction records
MDk;8574; --- with two transaction records
Transaction file:
MDk;8574;175;02.09.11;;DEBIT;;;;;-1112;;0;02.09.11;;;;20555;;;037;
MDk;8574;175;02.09.11;;CREDIT;;;;;104098800;;0;02.09.11;;;;;;;099;
We are using Correlation to merge the files by using the fields (MDk;8574;175;02.09.11)
Now the issue is that the BPM is not working: the correlation does not match because the balance file contains a row (with zero transaction records) that is not present in the transaction file.
I have to ignore the first record in the balance file, since it contains 0 transaction data, which means there are no records for this account in the transaction file.
How can I delete those records before reaching the merge condition? Is there anything I can do in the balance file adapter?
Any suggestions please?
Thanks
Deepthi
Hi Ramesh,
The problem is at the first step of the BPM, where we receive the files:
Start --> FORK (Rec1 & Rec2) --> TransformMap (Merge_to_targetfile) --> SendToReceiver --> END
It is failing at step 1 (Fork), where the files do not match according to the correlation condition we set,
i.e. ID, Date, Account number.
As the transaction file doesn't contain the record "MDk;1728;175;02.09.11" that is present in the balance file, the correlation does not match. Hence it fails; it does not even reach the map.
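If the balance file can be touched before it reaches the integration, one pragmatic option is to pre-filter it and drop rows whose trailing transaction-count field is 0, so only correlatable accounts reach the fork. A sketch of that filter (Python stand-in; in PI this would live in an adapter module or a mapping step), using the sample rows from this thread:

```python
def drop_zero_transaction_rows(balance_lines):
    """Keep only balance rows whose last semicolon-separated field
    (the transaction count) is non-zero."""
    kept = []
    for line in balance_lines:
        fields = line.rstrip().split(";")
        if fields[-1] != "0":
            kept.append(line)
    return kept

balance = [
    "MDk;1728;175;02.09.11;781961.09;0.00;0.00;781961.09;;;;;;;;;0",
    "MDk;8574;175;02.09.11;4462;1112;104098800;104102150;;;;;;;;;2",
]
print(drop_zero_transaction_rows(balance))
```

With the zero-transaction account removed, every remaining balance row has a matching transaction row, so the correlation can succeed.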
As the correlation mandates receiving two matching files, it fails here. -
Issue in Data from DSO to DSO Target with different Key
Hello All,
I am having an issue with data going from a source DSO to a target DSO with a different key.
The source DSO has Employee + Wage Type + Post number as its key, and the target has Employee + Wage Type as its key.
DSO semantic grouping works like a GROUP BY clause in SQL; is my understanding right?
Also, if someone could explain this with a small example, that would be great.
Many Thanks
Krishna
Dear, as explained earlier, your issue has nothing to do with semantic grouping.
Semantic grouping is only useful when you have written a routine in the transformation for calculations, and in error handling.
Please go through this blog, which explains the use of semantic grouping very clearly:
http://scn.sap.com/community/data-warehousing/netweaver-bw/blog/2013/06/16/semantic-groups-in-dtp
Now coming to your question above:
DSO 1
Employee WageTyp Amount
100 aa 200
200 aa 200
100 bb 400
100 cc 300
If we have the semantic group as Employee, Employee as the key of the target DSO, and update type summation,
then the target DSO will have:
Emp Amount
100 700
200 200
In this case the wage type will come from the last record arriving in the data package. If the record "100 cc 300" arrives last, the wage type will be cc.
2) Case 2
DSO 1
Employee WageTyp Amount
100 aa 200
200 aa 200
100 bb 400
100 aa 300
If we do semantic grouping on Employee and Wage Type, and have Employee and Wage Type as the key of the target DSO with update type summation,
then the target DSO will have:
Emp Wage Amount
100 aa 500
200 aa 200
100 bb 400
Hope this helps. -
How to keep data integrity across two business services in OSB 10.3.1.0
How do I keep data integrity across two business services in OSB 10.3.1.0?
In our customer's system, the customer wants to keep data integrity between two business services. I think this is an XA transaction issue.
Based on the customer requirement, I created a test case, but I can't keep data integrity. For detailed information, please refer to the attached docs.
Can you please explain what you mean by data integrity in your use case?
Manoj -
FMS Recording - Seek Causes Synch Issues Audio/Data Recording
Hi,
I have recordings generated by FMS 4.5. These recordings are a combination of data (for example, chat text typed into a chat window) and audio. The audio, as currently designed, comes on and off: the record stream plays an audio stream, but that stream is turned on and off by calling play(activeAudioStream) and then play(false) once the audio has ended, over and over.
The recordings (as FLV) play back fine when not using the seek feature. When using seek, I get obvious problems where playback simply stops and does not resume for some amount of time. For example, if I start a recording at 60 seconds in, then at 120 seconds playback stops and nothing happens for a long, seemingly random amount of time. The stream is still active, because when playback does resume, the time value of the stream includes the long period in which nothing happened.
This is very confusing, because the recording works fine from the beginning. I am fairly certain this has something to do with recording audio/video: a data-only recording is fine, and a recording with audio all the way through, even with data, is fine, but having the recorded audio stream turn on and off seems to cause major problems. Any ideas? If the audio is not continuous, should I create data-only and audio-only recordings, and set up my playback to stream the audio based on data events indicating the start/end of an audio section?
Note this is being played back in an AS2 application.
Thanks.
Are you using the Stream.flush API? If you are, it might cause some synchronization issues when data and audio/video are mixed.
I did not understand your use case clearly; let me know if you are using the above API. I think it would be better to use the second option, if you can use data and audio separately, but perhaps you can explain the use case better so that we can help you.
Two records getting inserted with the same timestamp...
hi all,
I am trying to submit a form. Whenever I click Submit, before inserting the data sent in the form I run a check (a SELECT statement to see whether the record can be inserted: a few business validations), and if the check succeeds I proceed with the insert into a table whose primary key is a running Oracle sequence.
But if I click the Submit button twice in close succession, in maybe 1 in 1000 attempts I am able to submit two records with the same timestamp for the date of insertion. We are using Oracle 8 with WebLogic 5.1, and I don't think Oracle's date precision goes beyond seconds.
So any suggestions on the best way to handle this? One option is to place the same business validation check just before the closing brace of the method; another is to submit the form through JavaScript and suppress the second submission even if the user clicks Submit twice. Any suggestions you can give are welcome.
Thanks in advance
sajan
Is the pkey a timestamp or an Oracle sequence? The latter will always work, since no two requests to a sequence can get the same value (rollover excluded). If you must use a timestamp, then you must auto-retry the insert if the first attempt fails. Oracle does have higher precision than seconds, but I can't remember the exact precision; I am pretty sure it works out to at least two or three digits.
Peace,
Cameron Purdy
Tangosol, Inc.
http://www.tangosol.com
Tangosol Server: Enabling enterprise application customization
"Sajan Parihar" <[email protected]> wrote in message
news:[email protected]...
>
> hi all,
> I am trying to submit a form . Now whenever I click submit before I
insert any
> data sent in that form I make a check (SELECT stmt. to see if that record
can
> be inserted ...few business validations..) and if the check is successful
I then
> proceed for the necessary insert into a table the Primary key for which is
a running
> Oracle sequence.
> But if I click on the Submit button twice in close succession I have
observed
> may be 1 in 1000 attempts I am able to submit two records with the same
time stamp
> for the date of insertion . We are using Oracle 8 with weblogic 5.1. And I
don't
> think ORACLE's date precision is beyond seconds.
> So any suggestion ..what is the best way to handle such things : one
can be
> to place the same business validation check just before the ending brace
of the
> method , or secondly sucmit the form thru javascript and don't submit it
twice
> even if the user clicks the submit button twice... any suggestion which u
can
> give .. are welcome.
>
> thnx in advance
> sajan
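Cameron's point, that a sequence-based surrogate key stays unique even when two submissions land in the same second, can be sketched with SQLite's AUTOINCREMENT standing in for an Oracle sequence (illustration only, not the poster's actual schema):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE submissions (
        id INTEGER PRIMARY KEY AUTOINCREMENT,  -- stands in for an Oracle sequence
        created_at TEXT NOT NULL               -- timestamp is an attribute, not the key
    )
""")

ts = "2024-01-01 12:00:00"  # identical second for both inserts
conn.execute("INSERT INTO submissions (created_at) VALUES (?)", (ts,))
conn.execute("INSERT INTO submissions (created_at) VALUES (?)", (ts,))

rows = conn.execute("SELECT id, created_at FROM submissions").fetchall()
print(rows)  # two rows with distinct ids despite identical timestamps
```

Because uniqueness comes from the generated key rather than the timestamp, double-clicking Submit cannot cause a key collision; duplicate business data then becomes a validation question, not a keying one.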
-
Problem with Purchase Info Record Condition Data ( ME11)
Hi all,
I am trying to post data for purchase info records with conditions data.
I am using a standard batch input method (LSMW):
Object 0060
Method 0000
Program Name RM06IBI0
Program Type B Batch Input
I am able to post data only for condition type PB00; for the other condition types, FRC1 and FRA1, I am not able to post the data (and the session does not give any error).
Therefore I am trying the BAPI option (BAPI_PRICES_CONDITIONS).
When I test the BAPI I get the error "Record KNUMH = 0000668598,
VARNUMH = 000000000000000000000000000000: det. data missing (Structure BAPICONDIT)".
I am passing the values below to the structure BAPICONDIT:
OPERATION 009
SD_DOC
ITM_NUMBER 000000
COND_NO 0000668598
COND_COUNT 02
APPLICATIO M
COND_TYPE FRC1
SCALETYPE F
SCALE_QTY 0.000
SCALE_VAL 1.000000000
CURRENCKEY USD
CURREN_ISO USD
CALCTYPCON C
COND_VALUE 2.000000000
CONDCURR USD
COND_ISO USD
COND_P_UNT 1
COND_UNIT KG
CONUNITISO KG
NUMCONVERT 1
DENOMINATO 1
BASE_UOM KG
BASEQTYISO KG
CONDITIDX 01
Please suggest what else needs to be passed to get the BAPI working.
Regards,
Sriram
Hi Phemmy,
it is possible to create a PO before the info record; the info record establishes the relationship between the vendor and your particular material.
To answer your 2nd question:
it is possible to create the info record automatically during PO creation. On the ME21N screen, enter the material code in the material tab, then go to item details; on the material data tab you will find the InfoUpdate checkbox. Check it, and the info record will be created automatically.
Thanks
Nisha
Edited by: nishaprasad on Jan 14, 2010 11:29 AM -
How to calculate the difference between any two dates using OBIEE 11g?
Hi,
I have a requirement:
location-wise current month and previous month with a movement calculation. Can you tell me how to calculate the difference between any two dates
using OBIEE 11g?
Note:
I tried implementing the AGO function as well as a dynamic two-date calculation using the $2-$1 method, but the output contains some null values, which is why it does not tally with our actual report.
I tried using IFNULL(measure_column, 0) and also a CASE condition on the measure column; it still does not tally.
Thanks and Regards,
Devarasu.R
Hi,
For date difference:
TimestampDiff(interval, timestamp1, timestamp2)
ex:TimestampDiff(SQL_TSI_DAY, cast('1-apr-2011' as date), current_date)
Where:
interval
The specified interval. Valid values are: SQL_TSI_SECOND, SQL_TSI_MINUTE, SQL_TSI_HOUR, SQL_TSI_DAY,
SQL_TSI_WEEK, SQL_TSI_MONTH, SQL_TSI_QUARTER, SQL_TSI_YEAR.
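With SQL_TSI_DAY, TimestampDiff is just a calendar-day difference. The equivalent in Python, handy for cross-checking the report's numbers offline:

```python
from datetime import date

def days_between(d1, d2):
    """Day difference, like TimestampDiff(SQL_TSI_DAY, d1, d2) in OBIEE."""
    return (d2 - d1).days

# Mirrors the documentation example: 1-apr-2011 to a later date
print(days_between(date(2011, 4, 1), date(2011, 4, 30)))
```

Note the result is signed: swapping the arguments negates it, matching TimestampDiff's ordering of timestamp1 and timestamp2.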
Cheers,
Aravind -
Refresh page with data from the Next Record in the Table through a Button
Scenario: a record of the table "prototype" is made up of 8 columns:
key_col,
text_col,
label1_col, label2_col, label3_col,
check1_col, check2_col, check3_col,
I have created the following items on a page:
a) A Display Only item that is populated through a SQL query
“SELECT text_col from prototype where rownum=key_seq.NEXTVAL “.
b) Hidden item for the database columns “label1_col, label2_col, label3_col”
Source type for the hidden items is of type SQL query, Source expression is:
Select label1_col from prototype where rownum=key_seq.NEXTVAL ;
Select label2_col from prototype where rownum=key_seq.NEXTVAL ;
Select label3_col from prototype where rownum=key_seq.NEXTVAL ;
(key_seq is a sequence).
c) Checkbox item for the database columns “ check1_col, check2_col,check3_col"
d) The labels for the above checkbox items are &label1_col. , &label2_col. , &label3_col.
I have created a Save button to save the state of the checkboxes; (STATIC:;1 )
I want the page to be refreshed with the data from the next record (Fields text_col, label1_col, label2_col, label3_col) through a “ Next” Button.
Can I please know how I can achieve this?
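One way to implement the "Next" lookup the question describes is to keep the current key in an item and query the first row beyond it, rather than comparing ROWNUM to a sequence value. A minimal sketch outside APEX (Python/SQLite; the table and column names follow the question, everything else is illustrative):

```python
import sqlite3

# Illustrative in-memory copy of the "prototype" table from the question.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE prototype (
    key_col INTEGER PRIMARY KEY,
    text_col TEXT,
    label1_col TEXT, label2_col TEXT, label3_col TEXT,
    check1_col INTEGER, check2_col INTEGER, check3_col INTEGER)""")
conn.executemany(
    "INSERT INTO prototype VALUES (?,?,?,?,?,0,0,0)",
    [(1, "first", "a", "b", "c"), (2, "second", "d", "e", "f")])

def next_record(current_key):
    # Order by the key and take the first row past the current one,
    # which is what a "Next" button needs.
    cur = conn.execute(
        "SELECT key_col, text_col FROM prototype "
        "WHERE key_col > ? ORDER BY key_col LIMIT 1", (current_key,))
    return cur.fetchone()

print(next_record(1))  # (2, 'second')
```

`next_record` returns `None` once the last row is reached, which the page can use to disable the button.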
Thanks in advance
If you need the value that is entered in the textbox as the email body, then try this:
<html>
<HEAD>
<title>WebForm1</title>
<script language="javascript">
function mailHTML() {
    // Read the textbox value and open the mail client with it as the body
    var content = document.getElementById('textBox').value;
    location.href = "mailto:?body=" + encodeURIComponent(content);
}
</script>
</head>
<body>
<form name="theform" id="theform">
<div name="body1">
<input type="text" value="Test" id="textBox"/>
<input type="button" value="Send Email" onClick="mailHTML()"/>
</div>
</form>
</body>
</html> -
Regarding message with no. of records, date, and time with AM/PM
I generated the IDoc with future-dated cost centers.
I want to display a message with the number of records held by the IDoc and "Date of Run: xx/xx/xxxx xx:xx AM/PM", where the x's represent today's date/time, plus "Number of Cost Centers Sent: x".
Please tell me the procedure, with an example.
While the action you describe is not normal for the iPhone, is this the only time that it has occurred, or has it happened again? There are the normal user troubleshooting procedures that you can follow (power off/on, reset, restore); however, if it has not happened again, it sounds like an isolated incident that has not been duplicated.
Also, do you have a passcode lock on the phone to prevent others from accessing your data?
Message was edited by: ChrisJ4203 -
Deleting records based on date with timestamp
Hi there,
In continuation of my earlier query: multiple records have inadvertently been inserted into the LOAN_TXN table. I want to delete all the records except one, based on date with timestamp.
CREATE TABLE LOAN_TXN (
    TXN_DATE DATE,
    BALANCE NUMBER(10,2),
    CODE VARCHAR2(1)
);
INSERT INTO LOAN_TXN ( TXN_DATE, BALANCE, CODE) VALUES (TO_Date( '02/15/2010 11:00:00 AM', 'MM/DD/YYYY HH:MI:SS AM'), 250000, 'D');
INSERT INTO LOAN_TXN ( TXN_DATE, BALANCE, CODE) VALUES (TO_Date( '03/31/2010 11:59:59 AM', 'MM/DD/YYYY HH:MI:SS AM'), 250000, 'B');
INSERT INTO LOAN_TXN ( TXN_DATE, BALANCE, CODE) VALUES (TO_Date( '05/14/2010 11:25:00 AM', 'MM/DD/YYYY HH:MI:SS AM'), 500000, 'D');
INSERT INTO LOAN_TXN ( TXN_DATE, BALANCE, CODE) VALUES (TO_Date( '06/30/2010 12:15:00 PM', 'MM/DD/YYYY HH:MI:SS AM'), 4000, 'R');
INSERT INTO LOAN_TXN ( TXN_DATE, BALANCE, CODE) VALUES (TO_Date( '08/02/2010 10:45:26 AM', 'MM/DD/YYYY HH:MI:SS AM'), 4000, 'R');
INSERT INTO LOAN_TXN ( TXN_DATE, BALANCE, CODE) VALUES (TO_Date( '09/08/2010 02:10:17 PM', 'MM/DD/YYYY HH:MI:SS AM'), 4000, 'R');
INSERT INTO LOAN_TXN ( TXN_DATE, BALANCE, CODE) VALUES (TO_Date( '10/27/2010 04:25:20 PM', 'MM/DD/YYYY HH:MI:SS AM'), 4000, 'R');
INSERT INTO LOAN_TXN ( TXN_DATE, BALANCE, CODE) VALUES (TO_Date( '11/09/2010 10:15:55 AM', 'MM/DD/YYYY HH:MI:SS AM'), 4000, 'R');
INSERT INTO LOAN_TXN ( TXN_DATE, BALANCE, CODE) VALUES (TO_Date( '12/29/2010 03:10:20 PM', 'MM/DD/YYYY HH:MI:SS AM'), 4000, 'R');
INSERT INTO LOAN_TXN ( TXN_DATE, BALANCE, CODE) VALUES (TO_Date( '01/12/2011 01:11:15 PM', 'MM/DD/YYYY HH:MI:SS AM'), 4000, 'R');
INSERT INTO LOAN_TXN ( TXN_DATE, BALANCE, CODE) VALUES (TO_Date( '02/11/2011 12:11:48 PM', 'MM/DD/YYYY HH:MI:SS AM'), 4000, 'R');
INSERT INTO LOAN_TXN ( TXN_DATE, BALANCE, CODE) VALUES (TO_Date( '03/31/2011 11:59:59 PM', 'MM/DD/YYYY HH:MI:SS AM'), 4000, 'R');
INSERT INTO LOAN_TXN ( TXN_DATE, BALANCE, CODE) VALUES (TO_Date( '03/31/2011 11:59:59 PM', 'MM/DD/YYYY HH:MI:SS AM'), 4000, 'R');
INSERT INTO LOAN_TXN ( TXN_DATE, BALANCE, CODE) VALUES (TO_Date( '03/31/2011 11:59:59 PM', 'MM/DD/YYYY HH:MI:SS AM'), 4000, 'R');
INSERT INTO LOAN_TXN ( TXN_DATE, BALANCE, CODE) VALUES (TO_Date( '03/31/2011 11:00:00 AM', 'MM/DD/YYYY HH:MI:SS AM'), 4000, 'R');
INSERT INTO LOAN_TXN ( TXN_DATE, BALANCE, CODE) VALUES (TO_Date( '03/31/2011 11:59:59 AM', 'MM/DD/YYYY HH:MI:SS AM'), 460000, 'B');
I want to delete multiple records, except one, with code 'R' and TXN_DATE = '03/31/2011 11:59:59 PM'. How do I do that?
user3308033 wrote:
I think your query would delete all the duplicate records. Keeping one, I just want to delete the multiple records with timestamp 11:59:59.
DELETE FROM loan_txn
 WHERE code = 'R'
   AND txn_date = TO_DATE('03/31/2011 11:59:59 PM','MM/DD/YYYY HH:MI:SS AM')
   AND rowid != (
                 SELECT MIN(rowid)
                   FROM loan_txn
                  WHERE txn_date = TO_DATE('03/31/2011 11:59:59 PM','MM/DD/YYYY HH:MI:SS AM')
                    AND code = 'R'
                )
/
SY. -
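The keep-one-delete-rest pattern in the ROWID query above can be demonstrated with SQLite's implicit rowid (a small Python sketch; the table and values are illustrative, not the poster's data):

```python
import sqlite3

# In-memory table with three duplicate 'R' rows and one unrelated 'B' row.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE loan_txn (txn_date TEXT, balance REAL, code TEXT)")
rows = [("2011-03-31 23:59:59", 4000, "R")] * 3 + \
       [("2011-03-31 11:59:59", 460000, "B")]
conn.executemany("INSERT INTO loan_txn VALUES (?,?,?)", rows)

# Delete every matching duplicate except the one with the smallest rowid.
conn.execute("""
    DELETE FROM loan_txn
    WHERE code = 'R' AND txn_date = '2011-03-31 23:59:59'
      AND rowid <> (SELECT MIN(rowid) FROM loan_txn
                    WHERE code = 'R' AND txn_date = '2011-03-31 23:59:59')
""")
remaining = conn.execute("SELECT COUNT(*) FROM loan_txn").fetchone()[0]
print(remaining)  # 2  (one 'R' row kept, plus the 'B' row)
```

Note the outer WHERE restricts the delete to the duplicate group; without it, every row except the surviving one would be removed.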
I use two PC's on a daily basis, both using 3.6.7. One does everything I would expect with regard to history, the other consistently loses history data, with no real pattern as to what it keeps and what it doesn't. Tried adjusting custom settings but nothing works.
oops! that should've been 3.6.8!