Use by Date Items
Hi
We have several items with a “use by” date.
Is there any possibility to enter this information in the stock data, so that we know when the material will be out of date?
Thank you.
Regards.
MB
Hi Matthew,
For your scenario, there is no way to enter this information at the stock level; you can only add information to the item.
The only data attached to both stock and items are their serial numbers, so you can add a field there and record the 'use by' date in that particular field.
That way you can manage the item with the required information along with the serial numbers.
Thanks
Ashish
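Outside SAP, the idea Ashish describes (keying shelf-life data by serial number) can be sketched in a few lines. This is an illustrative stand-alone sketch, not SAP functionality; the serial numbers, field names, and function names are all invented:

```python
from datetime import date

# Hypothetical store: serial number -> use-by date.
use_by = {
    "SN-0001": date(2025, 3, 1),
    "SN-0002": date(2024, 1, 15),
}

def expired(serial, today):
    """Return True when the serialized item is past its use-by date."""
    return use_by[serial] < today

def expired_serials(today):
    """Serial numbers already out of date, sorted by use-by date."""
    return sorted((s for s, d in use_by.items() if d < today),
                  key=lambda s: use_by[s])
```

A periodic job could call `expired_serials` to flag stock for disposal.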
Similar Messages
-
How to use "discontinuation data" in a BOM item?
Hi everyone!
I would like to use "discontinuation data" in a BOM item, not in the MRP4 view. How do I set this up? Thanks!
Dandan>
dandan li wrote:
>
>
> I want to know the "discontinuation data" in BOM item, not Priority of AltItemGroup !
Dear Dandan Li,
Please re-check Mr. SmanS's reply in that thread; it addresses your query.
Moreover, please try to encourage your repliers.
[committed quantity in production order;
Likewise, there are many other threads of yours.
Please don't misunderstand.
Regards
Madhu -
Using multiple data points in an item renderer?
I have a datagrid, bound to data from an XML file. My XML has
multiple items, such as:
<book>
<title></title>
<author></author>
<description></description>
<cover>someimage.jpg</cover>
</book>
I need to display each book as a unit, something like:
[image] Moby Dick
Herman Melville
A long book about a whale.
How would I accomplish this? Obviously, I can easily display
the discrete info in columns, but I'd like to lay it out as one
unit.
Thanks.
Use a list control instead of a datagrid. Then use a custom item renderer based on HBox or VBox, depending on your preference for horizontal or vertical layout... possibly even just a Box control so you can change it at your whim. -
Essbase error msg while using Excel add-in: data item found before member
We're encountering an Essbase error message "data item found before member" on 2 of 4 cubes. Has anyone else encountered this message, and does anyone have an idea how to resolve it? It occurs on multiple machines in development and testing environments.
As dougadams said, some members can be converted by Excel so that Essbase no longer treats them as labels. Typically this includes numeric member names (i.e. accounts) and dates. I find the dates to be more of an issue than numeric member names, because Excel typically changes the date format as well. Either way, using a single quote as a prefix to the member name will tell Excel to treat it as a text entry, and hence as a member/label to Essbase.
The above is almost always the reason for the error you got. Of course, it's also possible that you really do have a numeric entry somewhere before your first 'data cell' in the template, especially if you did some manipulation of the template.
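If you generate retrieval templates programmatically, the single-quote fix above can be automated. A minimal Python sketch of the idea; the detection rules (pure digits, or separator-plus-digits for dates) are assumptions and may need adjusting for your member-name conventions:

```python
def protect_member(name):
    """Prefix members Excel would coerce (numbers, dates) with a
    single quote so they stay text and Essbase still sees a label."""
    looks_numeric = name.replace(".", "", 1).isdigit()
    looks_date = (any(sep in name for sep in ("/", "-"))
                  and any(c.isdigit() for c in name))
    return "'" + name if (looks_numeric or looks_date) else name
```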
-
Using EL expression to define data item for a chart
I have a pie chart based on a ViewObject Data control. The pie chart is displaying the value of a predefined column, called Completed in the example below.
<graph IterBinding="ProjectsViewIterator" id="ProjectsView2"
xmlns="http://xmlns.oracle.com/adfm/dvt" type="PIE">
<graphDataMap leafOnly="true">
<series>
<item value="Material"/>
</series>
<groups>
<data>
<item value="Completed"/>
</data>
</groups>
</graphDataMap>
</graph>
I would like to replace the constant name Completed with an EL expression, for example #{column.name}, where column is a request-scope bean and name is a property. In other words, I would like to dynamically define which column's value to display in the chart.
When trying this, on the page I get "No data to display", and the following exception:
java.lang.NullPointerException
at oracle.adf.model.dvt.binding.transform.Utils.validateDataType(Utils.java:512)
at oracle.adf.model.dvt.binding.transform.TransformRowIterator.validateDataType(TransformRowIterator.java:409)
at oracle.adf.model.dvt.binding.transform.TransformRowIterator.getCell(TransformRowIterator.java:263)
at oracle.dss.util.transform.DataTable.getCell(DataTable.java:577)
at oracle.dss.util.transform.DataTable.processCurrentRow(DataTable.java:602)
at oracle.dss.util.transform.DataTable.processProjection(DataTable.java:416)
at oracle.dss.util.transform.DataTable.<init>(DataTable.java:112)
at oracle.dss.util.transform.ResultTable.<init>(ResultTable.java:67)
...
Is there any way to achieve this, i.e. dynamically define the data item name?
Istvan
Hi,
Is there anything new about this topic? Whether the new Jdeveloper has some improvements about the charts?
Zsolt -
Using FIFO as sql code for assigning indicator to data items
Hi All,
We are looking for a solution based on FIFO algorithm. We have a table having following data:
We need to perform FIFO on this table and assign an "EqpRef" to rows grouped by "object", based on the following conditions:
1. First we group the rows by the "object" column.
2. Then we traverse each group starting from the row having the minimum start time in the object group, e.g. row id 1 for object group "19O".
2.1 Assign an "EqpRef" of "object" + <an integer value>, where the integer value changes when a start/end chain finishes. A start/end chain is explained in step 2.2.
2.2 Then we pick the "nextstarttime" of the current row and compare it against the closest "starttime" among rows whose "start" is the same as the "end" of the row we picked, e.g. row id 1 of object 19O has "nextstarttime" 0310 closest to the "starttime" 0355 of row id 2 of object 19O, and row id 2 has "start" AAL, which matches the "end" of row id 1.
2.3 We perform this chaining until we find the end of the chain, and allocate the same "EqpRef" to each row in the chain.
Hence the output we need to generate will come as:
Kindly help with the same.
Thanks in advance
-Regards
Kumud
Hi,
Please find the following code block showing the input data and what the output should be.
--The input data
create table temp_table
(
row_id int,
engine_no varchar(20),
schedule_no varchar(20),
start_station varchar(20),
end_station varchar(20),
startdate datetime,
enddate datetime,
starttime datetime,
endtime datetime,
record_id int,
engine_id int,
Mgt int,
nextstarttime datetime,
Schedule_ref varchar(20),
Engine_Ref varchar(20)
)
GO
insert into temp_table values(1,'19O','101','SGC','IXP','2015/01/01','2015/01/01','00:00:00','01:00:00',1,10,60,'02:00:00',null,null)
insert into temp_table values(2,'19O','102','SGC','IXP','2015/01/01','2015/01/01','00:30:00','01:30:00',2,10,60,'02:30:00',null,null)
insert into temp_table values(3,'19O','103','SGC','IXP','2015/01/01','2015/01/01','02:30:00','03:30:00',3,10,60,'04:30:00',null,null)
insert into temp_table values(4,'19O','104','IXP','DFW','2015/01/01','2015/01/01','03:30:00','04:00:00',4,10,60,'05:00:00',null,null)
insert into temp_table values(5,'19O','105','IXP','DFW','2015/01/01','2015/01/01','04:30:00','05:30:00',5,10,60,'06:30:00',null,null)
insert into temp_table values(6,'19O','106','DFW','ABC','2015/01/01','2015/01/01','05:00:00','06:00:00',6,10,60,'07:00:00',null,null)
insert into temp_table values(7,'19O','107','DFW','ABC','2015/01/01','2015/01/01','06:00:00','07:00:00',7,10,60,'08:00:00',null,null)
insert into temp_table values(8,'19O','108','DFW','ABC','2015/01/01','2015/01/01','07:00:00','08:00:00',8,10,60,'09:00:00',null,null)
insert into temp_table values(9,'19O','109','ABC','DEF','2015/01/01','2015/01/01','10:00:00','11:30:00',9,10,60,'12:30:00',null,null)
insert into temp_table values(10,'19O','110','XYZ','BDW','2015/01/01','2015/01/01','13:00:00','15:00:00',10,10,60,'16:00:00',null,null)
insert into temp_table values(1,'319','111','PQR','STU','2015/01/01','2015/01/01','00:00:00','01:00:00',11,11,60,'02:00:00',null,null)
insert into temp_table values(2,'319','211','PQR','STU','2015/01/01','2015/01/01','04:30:00','15:30:00',12,11,60,'16:30:00',null,null)
insert into temp_table values(3,'319','112','STU','ABC','2015/01/01','2015/01/01','05:00:00','06:00:00',13,11,60,'07:00:00',null,null)
insert into temp_table values(4,'319','212','STU','DEF','2015/01/01','2015/01/01','06:00:00','07:00:00',14,11,60,'08:00:00',null,null)
insert into temp_table values(5,'319','213','STU','PQR','2015/01/01','2015/01/01','07:00:00','08:00:00',15,11,60,'09:00:00',null,null)
insert into temp_table values(6,'319','118','STU','XYZ','2015/01/01','2015/01/01','10:00:00','11:30:00',16,11,60,'12:30:00',null,null)
insert into temp_table values(7,'319','119','DEF','JKL','2015/01/01','2015/01/01','13:00:00','15:00:00',17,11,60,'16:00:00',null,null)
insert into temp_table values(8,'319','215','ABC','MNO','2015/01/01','2015/01/01','17:00:00','20:00:00',18,11,60,'21:00:00',null,null)
insert into temp_table values(1,'19O','101','SGC','IXP','2015/01/01','2015/01/01','00:00:00','01:00:00',1,10,60,'02:00:00',null,null)
insert into temp_table values(2,'19O','102','SGC','IXP','2015/01/01','2015/01/01','00:30:00','01:30:00',2,10,60,'02:30:00',null,null)
insert into temp_table values(3,'19O','103','SGC','IXP','2015/01/01','2015/01/01','02:30:00','03:30:00',3,10,60,'04:30:00',null,null)
insert into temp_table values(4,'19O','104','IXP','DFW','2015/01/01','2015/01/01','03:30:00','04:00:00',4,10,60,'05:00:00',null,null)
insert into temp_table values(5,'19O','105','IXP','DFW','2015/01/01','2015/01/01','04:30:00','05:30:00',5,10,60,'06:30:00',null,null)
insert into temp_table values(6,'19O','106','DFW','ABC','2015/01/01','2015/01/01','05:00:00','06:00:00',6,10,60,'07:00:00',null,null)
insert into temp_table values(8,'19O','108','DFW','ABC','2015/01/01','2015/01/01','07:00:00','08:00:00',8,10,60,'09:00:00',null,null)
insert into temp_table values(9,'19O','109','ABC','DEF','2015/01/01','2015/01/01','10:00:00','11:30:00',9,10,60,'12:30:00',null,null)
insert into temp_table values(1,'319','111','PQR','STU','2015/01/01','2015/01/01','00:00:00','01:00:00',11,11,60,'02:00:00',null,null)
insert into temp_table values(2,'319','211','PQR','STU','2015/01/01','2015/01/01','04:30:00','15:30:00',12,11,60,'16:30:00',null,null)
insert into temp_table values(3,'319','112','STU','ABC','2015/01/01','2015/01/01','05:00:00','06:00:00',13,11,60,'07:00:00',null,null)
insert into temp_table values(4,'319','212','STU','DEF','2015/01/01','2015/01/01','06:00:00','07:00:00',14,11,60,'08:00:00',null,null)
insert into temp_table values(5,'319','213','STU','PQR','2015/01/01','2015/01/01','07:00:00','08:00:00',15,11,60,'09:00:00',null,null)
insert into temp_table values(6,'319','118','STU','XYZ','2015/01/01','2015/01/01','10:00:00','11:30:00',16,11,60,'12:30:00',null,null)
insert into temp_table values(7,'319','119','DEF','JKL','2015/01/01','2015/01/01','13:00:00','15:00:00',17,11,60,'16:00:00',null,null)
insert into temp_table values(8,'319','215','ABC','MNO','2015/01/01','2015/01/01','17:00:00','20:00:00',18,11,60,'21:00:00',null,null)
--output should come as the data in temp_table_final
create table temp_table_final
(
row_id int,
engine_no varchar(20),
schedule_no varchar(20),
start_station varchar(20),
end_station varchar(20),
startdate datetime,
enddate datetime,
starttime datetime,
endtime datetime,
record_id int,
engine_id int,
Mgt int,
nextstarttime datetime,
Schedule_ref varchar(20),
Engine_Ref varchar(20)
)
GO
insert into temp_table_final values(1,'19O','101','SGC','IXP','2015/01/01','2015/01/01','00:00:00','01:00:00',1,10,60,'02:00:00','101','19O-1')
insert into temp_table_final values(2,'19O','102','SGC','IXP','2015/01/01','2015/01/01','00:30:00','01:30:00',2,10,60,'02:30:00','102','19O-2')
insert into temp_table_final values(3,'19O','103','SGC','IXP','2015/01/01','2015/01/01','02:30:00','03:30:00',3,10,60,'04:30:00','103','19O-3')
insert into temp_table_final values(4,'19O','104','IXP','DFW','2015/01/01','2015/01/01','03:30:00','04:00:00',4,10,60,'05:00:00','101','19O-1')
insert into temp_table_final values(5,'19O','105','IXP','DFW','2015/01/01','2015/01/01','04:30:00','05:30:00',5,10,60,'06:30:00','102','19O-2')
insert into temp_table_final values(6,'19O','106','DFW','ABC','2015/01/01','2015/01/01','05:00:00','06:00:00',6,10,60,'07:00:00','104','19O-1')
insert into temp_table_final values(7,'19O','107','DFW','ABC','2015/01/01','2015/01/01','06:00:00','07:00:00',7,10,60,'08:00:00','107','19O-4')
insert into temp_table_final values(8,'19O','108','DFW','ABC','2015/01/01','2015/01/01','07:00:00','08:00:00',8,10,60,'09:00:00','105','19O-2')
insert into temp_table_final values(9,'19O','109','ABC','DEF','2015/01/01','2015/01/01','10:00:00','11:30:00',9,10,60,'12:30:00','106','19O-1')
insert into temp_table_final values(10,'19O','110','XYZ','BDW','2015/01/01','2015/01/01','13:00:00','15:00:00',10,10,60,'16:00:00','110','19O-5')
insert into temp_table_final values(1,'319','111','PQR','STU','2015/01/01','2015/01/01','00:00:00','01:00:00',11,11,60,'02:00:00','111','319-1')
insert into temp_table_final values(2,'319','211','PQR','STU','2015/01/01','2015/01/01','04:30:00','15:30:00',12,11,60,'16:30:00','211','319-2')
insert into temp_table_final values(3,'319','112','STU','ABC','2015/01/01','2015/01/01','05:00:00','06:00:00',13,11,60,'07:00:00','111','319-1')
insert into temp_table_final values(4,'319','212','STU','DEF','2015/01/01','2015/01/01','06:00:00','07:00:00',14,11,60,'08:00:00','212','319-3')
insert into temp_table_final values(5,'319','213','STU','PQR','2015/01/01','2015/01/01','07:00:00','08:00:00',15,11,60,'09:00:00','213','319-4')
insert into temp_table_final values(6,'319','118','STU','XYZ','2015/01/01','2015/01/01','10:00:00','11:30:00',16,11,60,'12:30:00','118','319-5')
insert into temp_table_final values(7,'319','119','DEF','JKL','2015/01/01','2015/01/01','13:00:00','15:00:00',17,11,60,'16:00:00','212','319-3')
insert into temp_table_final values(8,'319','215','ABC','MNO','2015/01/01','2015/01/01','17:00:00','20:00:00',18,11,60,'21:00:00','112','319-1')
insert into temp_table_final values(1,'19O','101','SGC','IXP','2015/01/01','2015/01/01','00:00:00','01:00:00',1,10,60,'02:00:00','101','19O-1')
insert into temp_table_final values(2,'19O','102','SGC','IXP','2015/01/01','2015/01/01','00:30:00','01:30:00',2,10,60,'02:30:00','102','19O-2')
insert into temp_table_final values(3,'19O','103','SGC','IXP','2015/01/01','2015/01/01','02:30:00','03:30:00',3,10,60,'04:30:00','103','19O-3')
insert into temp_table_final values(4,'19O','104','IXP','DFW','2015/01/01','2015/01/01','03:30:00','04:00:00',4,10,60,'05:00:00','101','19O-1')
insert into temp_table_final values(5,'19O','105','IXP','DFW','2015/01/01','2015/01/01','04:30:00','05:30:00',5,10,60,'06:30:00','102','19O-2')
insert into temp_table_final values(6,'19O','106','DFW','ABC','2015/01/01','2015/01/01','05:00:00','06:00:00',6,10,60,'07:00:00','104','19O-1')
insert into temp_table_final values(8,'19O','108','DFW','ABC','2015/01/01','2015/01/01','07:00:00','08:00:00',8,10,60,'09:00:00','105','19O-2')
insert into temp_table_final values(9,'19O','109','ABC','DEF','2015/01/01','2015/01/01','10:00:00','11:30:00',9,10,60,'12:30:00','106','19O-1')
insert into temp_table_final values(1,'319','111','PQR','STU','2015/01/01','2015/01/01','00:00:00','01:00:00',11,11,60,'02:00:00','111','319-1')
insert into temp_table_final values(2,'319','211','PQR','STU','2015/01/01','2015/01/01','04:30:00','15:30:00',12,11,60,'16:30:00','211','319-2')
insert into temp_table_final values(3,'319','112','STU','ABC','2015/01/01','2015/01/01','05:00:00','06:00:00',13,11,60,'07:00:00','111','319-1')
insert into temp_table_final values(4,'319','212','STU','DEF','2015/01/01','2015/01/01','06:00:00','07:00:00',14,11,60,'08:00:00','212','319-3')
insert into temp_table_final values(5,'319','213','STU','PQR','2015/01/01','2015/01/01','07:00:00','08:00:00',15,11,60,'09:00:00','213','319-4')
insert into temp_table_final values(6,'319','118','STU','XYZ','2015/01/01','2015/01/01','10:00:00','11:30:00',16,11,60,'12:30:00','118','319-5')
insert into temp_table_final values(7,'319','119','DEF','JKL','2015/01/01','2015/01/01','13:00:00','15:00:00',17,11,60,'16:00:00','212','319-3')
insert into temp_table_final values(8,'319','215','ABC','MNO','2015/01/01','2015/01/01','17:00:00','20:00:00',18,11,60,'21:00:00','112','319-1')
What we are doing here is generating a schedule of train departures.
Here, we identify train schedules by making a chain of stations, considering that the end station of a record for a given engine no. should be the start station of another record for the same engine no.; also, the chained record's start time should be the one nearest to the "nextstarttime" at that station.
For example: if we pick the first row, "SGC-IXP", its nextstarttime is "02:00:00 am". This means the train that departed from SGC will reach IXP and be available for departure from IXP after "02:00:00 am". So we consider the records having IXP as start station and take the start time nearest to the nextstarttime ("02:00:00"). The next train departure in the chain would therefore be IXP-DFW, with start time "03:30:00 am".
As you can see, we assign the schedule no. of the previously considered record to the chained schedule, so "schedule_ref" is updated to 101. We also assign engine no - <integer counter> to each single chain of schedules, stored in engine_ref.
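Reading the rules above as first-in, first-out reuse of an engine's chains (reuse the chain that became available earliest whose last end station matches the new start station, else open a new chain), the logic can be prototyped outside SQL first. This is a hedged Python sketch of one plausible interpretation, not the requested T-SQL solution; times are minutes since midnight and the field names are simplified:

```python
from collections import defaultdict

def assign_chains(rows):
    """Greedy FIFO chaining. Each row is a dict with keys: row_id,
    engine, start, end, starttime, nextstarttime (times in minutes).
    Returns {row_id: "engine-<chain counter>"}."""
    labels = {}
    by_engine = defaultdict(list)
    for r in rows:
        by_engine[r["engine"]].append(r)
    for engine, group in by_engine.items():
        group.sort(key=lambda r: r["starttime"])
        chains = []  # each chain: {"id": n, "end": station, "avail": time}
        for r in group:
            # Chains whose last leg ended at this row's start station
            # and that are already available at this departure time.
            cands = [c for c in chains
                     if c["end"] == r["start"] and c["avail"] <= r["starttime"]]
            if cands:
                # FIFO: reuse the chain that became available first.
                c = min(cands, key=lambda c: c["avail"])
            else:
                c = {"id": len(chains) + 1}
                chains.append(c)
            c["end"] = r["end"]
            c["avail"] = r["nextstarttime"]
            labels[r["row_id"]] = f"{engine}-{c['id']}"
    return labels
```

Under this reading, the sketch reproduces the Engine_Ref sequence shown for the first '19O' group in temp_table_final; translating it to set-based T-SQL (or a cursor) is the remaining work.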
Regards
Kumud -
BTREE and duplicate data items: over 300 people read this, nobody answers?
I have a btree consisting of keys (a 4 byte integer) - and data (a 8 byte integer).
Both integral values are "most significant byte (MSB) first" since BDB does key compression, though I doubt there is much to compress with such small key size. But MSB also allows me to use the default lexical order for comparison and I'm cool with that.
The special thing about it is that with a given key, there can be a LOT of associated data, thousands to tens of thousands. To illustrate, a btree with an 8192-byte page size has 3 levels, 0 overflow pages and 35208 duplicate pages!
In other words, my keys have a large "fan-out". Note that I wrote "can", since some keys only have a few dozen or so associated data items.
So I configure the b-tree for DB_DUPSORT. The default lexical ordering with set_dup_compare is OK, so I don't touch that. I'm getting the data items sorted as a bonus, but I don't need that in my application.
However, I'm seeing very poor "put (DB_NODUPDATA) performance", due to a lot of disk read operations.
While there may be a lot of reasons for this anomaly, I suspect BDB spends a lot of time tracking down duplicate data items.
I wonder if in my case it would be more efficient to have a b-tree with the combined (4 byte integer, 8 byte integer) as the key and a zero-length or 1-length dummy data item (in case zero-length is not an option).
I would lose the ability to iterate with a cursor using DB_NEXT_DUP, but I could simulate it using DB_SET_RANGE and DB_NEXT, checking whether my composite key still has the correct "prefix". That would be a pain in the butt for me, but still workable if there's no other solution.
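The composite-key layout can be checked outside BDB. A minimal Python sketch (function names invented) showing that MSB-first packing keeps lexicographic byte order consistent with numeric order, plus the prefix test that would stand in for DB_NEXT_DUP during a DB_SET_RANGE/DB_NEXT scan:

```python
import struct

def composite_key(key32, data64):
    """Big-endian ("MSB first") packing of a 4-byte key plus an
    8-byte value: lexicographic byte order then equals numeric
    order, so the default BDB comparator sorts these correctly."""
    return struct.pack(">IQ", key32, data64)

def same_prefix(composite, key32):
    """The DB_NEXT_DUP simulation: keep iterating while the first
    4 bytes still match the key we started the range scan from."""
    return composite[:4] == struct.pack(">I", key32)
```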
Another possibility would be to just add all the data integers as a single big giant data blob item associated with a single (unique) key. But maybe this is just doing what BDB does... and would probably exchange "duplicate pages" for "overflow pages"
Or, the slowdown is a BTREE thing and I could use a hash table instead. In fact, what I don't know is how duplicate pages influence insertion speed. But the BDB source code indicates that in contrast to BTREE the duplicate search in a hash table is LINEAR (!!!) which is a no-no (from hash_dup.c):
while (i < hcp->dup_tlen) {
        memcpy(&len, data, sizeof(db_indx_t));
        data += sizeof(db_indx_t);
        DB_SET_DBT(cur, data, len);
        /*
         * If we find an exact match, we're done. If in a sorted
         * duplicate set and the item is larger than our test item,
         * we're done. In the latter case, if permitting partial
         * matches, it's not a failure.
         */
        *cmpp = func(dbp, dbt, &cur);
        if (*cmpp == 0)
                break;
        if (*cmpp < 0 && dbp->dup_compare != NULL) {
                if (flags == DB_GET_BOTH_RANGE)
                        *cmpp = 0;
                break;
        }
What's the expert opinion on this subject?
Vincent
Message was edited by:
user552628
Hi,
The special thing about it is that with a given key,
there can be a LOT of associated data, thousands to
tens of thousands. To illustrate, a btree with a 8192
byte page size has 3 levels, 0 overflow pages and
35208 duplicate pages!
In other words, my keys have a large "fan-out". Note
that I wrote "can", since some keys only have a few
dozen or so associated data items.
So I configure the b-tree for DB_DUPSORT. The default
lexical ordering with set_dup_compare is OK, so I
don't touch that. I'm getting the data items sorted
as a bonus, but I don't need that in my application.
However, I'm seeing very poor "put (DB_NODUPDATA)
performance", due to a lot of disk read operations.
In general, performance slowly degrades when there are a lot of duplicates associated with a key. For the Btree access method, lookups and inserts have O(log n) complexity (which implies that the search time depends on the number of keys stored in the underlying db tree). When doing puts with DB_NODUPDATA, leaf pages have to be searched in order to determine whether the data is a duplicate. Thus, given that for each key there is (in most cases) a large number of associated data items (up to thousands or tens of thousands), an impressive number of pages have to be brought into the cache to check against the duplicate criterion.
Of course, the problem of sizing the cache and the database's pages arises here. Your settings for these should tend toward large values; this way the cache will be able to accommodate large pages (in which hundreds of records can be hosted).
Setting the cache and the page size to their ideal values is a process of experimentation.
http://www.oracle.com/technology/documentation/berkeley-db/db/ref/am_conf/pagesize.html
http://www.oracle.com/technology/documentation/berkeley-db/db/ref/am_conf/cachesize.html
While there may be a lot of reasons for this anomaly,
I suspect BDB spends a lot of time tracking down
duplicate data items.
I wonder if in my case it would be more efficient to
have a b-tree with as key the combined (4 byte
integer, 8 byte integer) and a zero-length or
1-length dummy data (in case zero-length is not an
option).
Indeed, this should be the best alternative, but testing must be done first. Try this approach and provide us with feedback.
You can have records with a zero-length data portion.
Also, you could provide more information on whether or not you're using an environment and, if so, how you configured it, etc. Have you thought of using multiple threads to load the data?
Another possibility would be to just add all the
data integers as a single big giant data blob item
associated with a single (unique) key. But maybe this
is just doing what BDB does... and would probably
exchange "duplicate pages" for "overflow pages"
This is a terrible approach, since bringing an overflow page into the cache is more time-consuming than bringing in a regular page, and thus a performance penalty results. Also, processing the entire collection of keys and data implies more work from a programming point of view.
Or, the slowdown is a BTREE thing and I could use a
hash table instead. In fact, what I don't know is how
duplicate pages influence insertion speed. But the
BDB source code indicates that in contrast to BTREE
the duplicate search in a hash table is LINEAR (!!!)
which is a no-no (from hash_dup.c):
The Hash access method has, as you observed, a linear search within a duplicate set (so the search time is proportional to the number of items in the bucket, even though locating the bucket itself is O(1)). Combined with the fact that you don't want duplicate data, the Hash access method may not improve performance.
This is a performance/tuning problem, and it involves a lot of resources on our part to investigate. If you have a support contract with Oracle, then please don't hesitate to put your issue up on Metalink, or indicate that you want this issue to be taken private, and we will create an SR for you.
Regards,
Andrei -
Upload multiple files to a data item in sharepoint list
The image above shows a list item with two PDF files attached to it. This is an Access database that was pushed to this SharePoint list. When we attached these files we used "attach file" from the edit menu at the top of the page.
They are put into a data item called "copy of sepration report", which I can't seem to find when I edit the list. As a further discussion of this, we would like to be able to upload multiple items into their own data fields, i.e.
one could be separation report, another could be accidents, and another would be disciplinary. Each would have the capability of having multiple items uploaded to it.
What am I missing????
Since you can't attach documents to a list item field, you may need to think the other way around. You can create a document library and give the document library all these fields (separation report, copy of separation report, etc.). So instead of the list item having the documents attached, the document library will have the fields attached. You can also group the fields into two groups: fields that are not directly related to the document, and fields that are. Then you can move the document-related fields to the document library, create another list with the non-document fields, and link this new list to the document library using a lookup column.
Thanks,
Sohel Rana
http://ranaictiu-technicalblog.blogspot.com -
How to use multiple data controls in a single JSF page
Hi,
I am using the Essbase Data Control in my project to get Essbase cube data into an ADF table/pivot table.
Suppose the [Item] dimension has this hierarchy: [Category] -> [Segment] -> [Brand] -> [Item].
I need to display Category in one column, Segment in the next column and [Brand] in another column.
The different types of Category, Segment and Brand should display as table data (data values).
Using an MDX query I cannot print the [Item] hierarchy in different columns, so I am planning to use multiple data controls.
Could anybody help me find a solution?
Hi,
sounds like you want to try the Pivot table
http://download.oracle.com/docs/cd/E21764_01/apirefs.1111/e12418/tagdoc/dvt_pivotTable.html
I don't think that multiple Data Controls is the solution to the problem
Frank -
VA01 upload using call transaction, item details are overwritten, how to solve
Hi experts,
I have attached below the code I wrote. While paging down, the item details are being overwritten. How do I solve this? Please give me your suggestions.
INCLUDE BDCRECX1.
PARAMETERS: DATASET(132) LOWER CASE.
DATA: BEGIN OF RECORD_HEAD,
        VBELN(10),
* data element: AUART
        AUART(004),
* data element: VKORG
        VKORG(004),
* data element: VTWEG
        VTWEG(002),
* data element: SPART
        SPART(002),
* data element: KUNAG
        KUNNR(010),
* data element: KUNWE
        KUNNR_007(010),
* data element: BSTKD
        BSTKD(035),
      END OF RECORD_HEAD.
DATA: BEGIN OF RECORD_ITEM,
* data element: VBELN
        VBELN(10),
* data element: MATNR
        MABNR(018),
* data element: KWMENG
        KWMENG(019),
      END OF RECORD_ITEM.
*** End generated data section ***
DATA: IT_SO_HEAD LIKE TABLE OF RECORD_HEAD,
IT_SO_ITEM LIKE TABLE OF RECORD_ITEM,
FLAG.
data: counter type num value '1'.
START-OF-SELECTION.
CALL FUNCTION 'GUI_UPLOAD'
EXPORTING
FILENAME = 'C:\USERS\SANGEETHA\DESKTOP\SO_HEADER.TXT'
FILETYPE = 'ASC'
HAS_FIELD_SEPARATOR = 'X'
TABLES
DATA_TAB = IT_SO_HEAD.
CALL FUNCTION 'GUI_UPLOAD'
EXPORTING
FILENAME = 'C:\USERS\SANGEETHA\DESKTOP\SO_ITEM.txt'
FILETYPE = 'ASC'
HAS_FIELD_SEPARATOR = 'X'
TABLES
DATA_TAB = IT_SO_ITEM.
LOOP AT IT_SO_HEAD INTO RECORD_HEAD.
AT NEW VBELN.
FLAG = 'X'.
ENDAT.
PERFORM BDC_DYNPRO USING 'SAPMV45A' '0101'.
PERFORM BDC_FIELD USING 'BDC_CURSOR'
'VBAK-AUART'.
PERFORM BDC_FIELD USING 'BDC_OKCODE'
'/00'.
PERFORM BDC_FIELD USING 'VBAK-AUART'
RECORD_HEAD-AUART.
PERFORM BDC_FIELD USING 'VBAK-VKORG'
RECORD_HEAD-VKORG.
PERFORM BDC_FIELD USING 'VBAK-VTWEG'
RECORD_HEAD-VTWEG.
PERFORM BDC_FIELD USING 'VBAK-SPART'
RECORD_HEAD-SPART.
PERFORM BDC_DYNPRO USING 'SAPMV45A' '4001'.
PERFORM BDC_FIELD USING 'BDC_OKCODE'
'/00'. " Enter
PERFORM BDC_FIELD USING 'KUAGV-KUNNR'
RECORD_HEAD-KUNNR.
PERFORM BDC_FIELD USING 'KUWEV-KUNNR'
RECORD_HEAD-KUNNR_007.
PERFORM BDC_FIELD USING 'VBKD-BSTKD'
RECORD_HEAD-BSTKD.
PERFORM BDC_FIELD USING 'BDC_CURSOR'
'RV45A-MABNR(01)'.
LOOP AT IT_SO_ITEM INTO RECORD_ITEM WHERE VBELN = RECORD_HEAD-VBELN.
IF FLAG = 'X'.
PERFORM BDC_FIELD USING 'RV45A-MABNR(01)'
RECORD_ITEM-MABNR.
PERFORM BDC_FIELD USING 'RV45A-KWMENG(01)'
RECORD_ITEM-KWMENG.
PERFORM BDC_DYNPRO USING 'SAPMV45A' '4001'.
PERFORM BDC_FIELD USING 'BDC_OKCODE'
'=/00'.
PERFORM BDC_FIELD USING 'BDC_CURSOR'
'RV45A-MABNR(02)'.
CLEAR FLAG.
ELSE.
PERFORM BDC_FIELD USING 'RV45A-MABNR(02)'
RECORD_ITEM-MABNR.
PERFORM BDC_FIELD USING 'RV45A-KWMENG(02)'
RECORD_ITEM-KWMENG.
PERFORM BDC_FIELD USING 'BDC_CURSOR'
'RV45A-KWMENG(02)'.
PERFORM BDC_DYNPRO USING 'SAPMV45A' '4001'.
PERFORM BDC_FIELD USING 'BDC_OKCODE'
'=p++'.
PERFORM BDC_FIELD USING 'BDC_CURSOR'
'RV45A-MABNR(02)'.
ENDIF.
ENDLOOP.
PERFORM BDC_DYNPRO USING 'SAPMV45A' '4001'.
PERFORM BDC_FIELD USING 'BDC_OKCODE'
'=SICH'.
data : l_option like ctu_params.
l_option-dismode = CTUMODE.
l_option-updmode = CUPDATE.
l_option-defsize = 'X'.
CALL TRANSACTION 'VA01' USING BDCDATA
options from l_option
MESSAGES INTO MESSTAB.
CLEAR: BDCDATA,BDCDATA[].
ENDLOOP.
John, it's best to use BAPI_SALESDOCU_CREATEFROMDATA1.
But if you are still going to do a BDC, Do not use the pagedown for new line items. Use the "Create Item" icon in the bottom of the screen. This way you can add as many items as you want and the cursor will always be in the second row of the table control.
Please check the code below. Always enter items in the 2nd row, even for the first item. It should work.
LOOP AT IT_SO_ITEM INTO RECORD_ITEM WHERE VBELN = RECORD_HEAD-VBELN.
  PERFORM BDC_FIELD USING 'RV45A-MABNR(02)'
                          RECORD_ITEM-MABNR.
  PERFORM BDC_FIELD USING 'RV45A-KWMENG(02)'
                          RECORD_ITEM-KWMENG.
  PERFORM BDC_DYNPRO USING 'SAPMV45A' '4001'.
  PERFORM BDC_FIELD USING 'BDC_OKCODE'
                          '=POAN'.
ENDLOOP.
reward points if helpful -
Column chart with null data items
I have a Column Chart with an ArrayCollection data provider.
The data will sometimes have missing items in some "rows'. For
example, it might look like this where the second object has no
"Male" property:
public var myData:ArrayCollection = new ArrayCollection([
{School: "Priorford", All: 95, Male: 92, Female: 98},
{School: "Giffnock", All: 87, Female: 89},
{School: "Hastings", All: 80, Male: 78, Female: 82}
]);
Sometimes I get the following error:
TypeError: Error #1009: Cannot access a property or method of
a null object
reference.
at mx.charts.chartClasses::NumericAxis/mapCache()
I'm assuming the error is because of missing data items, but
I'm not 100% sure. Some data sets with missing items will display
with no error. I can't see a pattern to those that don't.
-- I can't produce the error when using a "static" data
source set in the Flex app itself (such as the example above).
These always display correctly, even with missing items.
-- In the problem case, the source data comes from a MySQL
database through a PHP script that is called with a Flex
httpService. The returned data is parsed in Flex, and put into the
ArrayCollection. I was suspecting that the returned data was flawed
in some way, but the same ArrayCollection displays correctly in a
DataGrid. The problem is only with the Chart.
-- I can work round the problem by adding the missing
properties and setting them to an empty value. But I don't think I
should have to do this.
Has anyone seen this problem, or does anyone know of any obvious thing I might be doing wrong?
Thanks Arthur.
Your first suggestion is effectively what I'm doing as a
workaround. But it is a bit of a pain to have to account for this
each time when displaying a chart. It's a particular problem when
you don't know in advance what "categories" might be returned by
the data and that have to be charted.
My biggest concern is that this looks like a bug, and the
fact that it seems to occur intermittently is a bit worrying. (I
also know that eight out of ten reported "bugs" are user error,
which is why I was wondering if anyone else had experienced this.)
About your second suggestion, I think the interpolateValues property only applies to LineSeries.
-
My Client wants to restrict appearing used PO Line items in MIRO
Hi Gurus
My client wants to prevent blank line items from appearing in MIRO against Purchase Order items that have already been fully processed (full quantity GR/IR posted, GR-based IV), so that users cannot book another invoice against the same PO with a different reference number and invoice date.
Is there any way (User Exit / Screen Variant) we can make it?
Please reply.
Thanks
Sanjib
Hi
Even if I set the Final Invoice indicator in MIRO and book an invoice against the PO, the system still shows blank line items (amount and quantity are blank) in MIRO against the PO.
Can you please run the scenario and tell me whether there is any way to stop the system from showing blank line items against already processed PO items in MIRO? I have tried booking the invoice against individual delivery notes, but the blank line items are still there.
Thanks
Sanjib
When to use Drop-In Item Renderers and Inline Item Renderers?
Hi,
I am getting confused about when to use inline item renderers and when to use drop-in item renderers.
My feeling is that drop-in item renderers are easy to use and can satisfy most requirements.
But the tutorials I have read say we can't always use drop-in renderers, because their drawback is that you cannot configure them.
Please help me.
Hi Kiran,
Here is the detailed explanation you needed:
You can also refer the link below:
http://blog.flexdevelopers.com/2009/02/flex-basics-item-renderers.html
Drop-In Item Renderers
Drop-In Item Renderers are generic in nature and don't rely on specific data fields to render data. This allows them to be used with a wide range of data sets; hence the term “drop-in”. Drop-In Item Renderers can be “dropped in” to any list-based control regardless of the data provider's data properties.
In our previous example, the employee photo property requires use of a custom Item Renderer to render properly in the UI. In this scenario the Image component satisfies our rendering needs out of the box. Implemented as a Drop-In Item Renderer, the Image component takes any data property regardless of name and uses it as the Image component's source property value. Assuming our employee photo property contains a valid image path, the Image Drop-In Item Renderer will work perfectly and resolve the image path as an image in the UI.
<!-- Drop-in Item Renderer: Image control -->
<mx:DataGridColumn dataField="photo"
headerText="Employee Photo"
itemRenderer="mx.controls.Image"/>
Drop-In Item Renderers are simple and easy to use and satisfy specific use cases nicely. However, they provide no flexibility whatsoever. If your needs are not satisfied by a Drop-In Item Renderer, you must create your own Item Renderer as an inline component or an external component.
Inline Item Renderers
Generally used for simple item rendering requiring minimal customization, inline Item Renderers are defined as a component nested within the MXML of your list-based control.
It is important to note that Item Renderers nested within the itemRenderer property of a list-based control occupy a different scope than the list-based control. Any attempt to reference members (properties or methods) of the parent component from the nested Item Renderer component will result in a compile-time error. However, references to the members of the parent component can be achieved by utilizing the outerDocument object.
<mx:DataGrid id="myGrid" dataProvider="{gridData}">
<mx:columns>
<mx:DataGridColumn headerText="Show Relevance">
<mx:itemRenderer>
<mx:Component>
<mx:Image source="{'assets/images/indicator_' + data.showRelevance + '.png'}"
toolTip="{(data.showRelevance == 1) ? 'On' : 'Off'}"
click="outerDocument.toggle()" />
</mx:Component>
</mx:itemRenderer>
</mx:DataGridColumn>
</mx:columns>
</mx:DataGrid>
Remember, rules of encapsulation still apply. Mark all properties or methods public if you want them accessible by your inline Item Renderer. In the previous example, the toggle() method must have a public access modifier to expose itself to the inline Item Renderer.
public function toggle():void
Inline Item Renderers can also be made reusable by creating a named component instance outside of the list-based control. This component must have an id property and contain the rendering logic of the Item Renderer. Using data bindings, the component is assigned to the itemRenderer property of one or more data properties of a list-based control.
<!-- Reusable inline Item Renderer -->
<mx:Component id="ImageRenderer">
<mx:VBox width="100%" height="140"
horizontalAlign="center" verticalAlign="middle">
<mx:Image source="{'assets/'+data.image}"/>
<mx:Label text="{data.image}" />
</mx:VBox>
</mx:Component>
<!-- Used within a list-based control-->
<mx:DataGridColumn headerText="Image"
dataField="image" width="150"
itemRenderer="{ImageRenderer}"/>
In the previous example, note that the Item Renderer component contains 2 UI controls – Image and Label. When using multiple controls within an Item Renderer, a layout container is required. In this example, a VBox was used.
If this post answers your question or helps, please kindly mark it as such.
Thanks,
Bhasker Chari
Message was edited by: BhaskerChari
-
How to validate a date item "DD/MM/YYYY HH24:MI" ?
Hi,
I want to create a validation item. I have a date item using this mask: "DD/MM/YYYY HH24:MI".
I want to validate whether a date was entered according to the syntax "DD/MM/YYYY HH24:MI".
Unfortunately APEX only gives me the choice to validate whether a date is in the form "DD/MM/YYYY", which gives this syntax:
^([012]?[[:digit:]]|3[01])/(0?[[:digit:]]|1[012])/[[:digit:]]{4}$
I don't know this syntax; do you know how I could transform it (I mean, add the hours and minutes criteria) to attain my goal?
Sorry for my English, not perfect today!
Regards,
Christian
Try this:
SELECT REGEXP_INSTR('04-OCT-2011 23:12',
'^([012]?[[:digit:]]|3[01])-(JAN|FEB|MAR|APR|MAY|JUN|JUL|AUG|SEP|OCT|NOV|DEC)-[[:digit:]]{4}[[:space:]]([01][0-9]|2[0-3]):[0-5][0-9]$') test_result
FROM dual

For case-insensitive matching, use:

SELECT REGEXP_INSTR('04-OCT-2011 23:12',
'^([012]?[[:digit:]]|3[01])-(JAN|FEB|MAR|APR|MAY|JUN|JUL|AUG|SEP|OCT|NOV|DEC)-[[:digit:]]{4}[[:space:]]([01][0-9]|2[0-3]):[0-5][0-9]$',1,1,0,'i') test_result
FROM dual

Thanks!
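Since the original question asked for the "DD/MM/YYYY HH24:MI" mask rather than DD-MON-YYYY, an equivalent pattern might look like the sketch below (shown in JavaScript for easy testing; the same ranges expressed with POSIX character classes would work in Oracle's REGEXP_INSTR):

```javascript
// Hypothetical pattern for the asker's actual mask, DD/MM/YYYY HH24:MI.
// Day 01-31, month 01-12, 4-digit year, hours 00-23, minutes 00-59.
// It does not check month lengths or leap years; date parsing would
// still be needed for full calendar validation.
const ddmmyyyyHHMM =
  /^([012]?\d|3[01])\/(0?\d|1[012])\/\d{4} ([01]\d|2[0-3]):[0-5]\d$/;

console.log(ddmmyyyyHHMM.test("04/10/2011 23:12")); // true
console.log(ddmmyyyyHHMM.test("04/10/2011 23:61")); // false
```

The hour and minute alternations are the part the DD/MM/YYYY-only pattern was missing.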
JMcG -
Combo Box Use Global Data issue
I am using a drop-down combo box to select a name, with "Specify Item Values" checked, and I use the item numbers in a switch statement to populate other text boxes depending on the drop-down selection. The same items appear several times in the form, so I set them to use global data. Everything works except the text boxes bound to the drop-down via global data: they show the item number, not the text value. How do I get the text value to show instead of the item value? Thank you!
Resolved it by changing the switch statement to use the text value instead of the item number, and unchecking "Specify Item Values" in the combo box. I would be interested, for future use, in learning how to do this, but no worries if not. At least the form is operating the way it needs to. Thanks.
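The fix described above — switching on the displayed text rather than the bound item number — can be sketched generically (the names and values here are hypothetical; in a LiveCycle form the selected text would come from the combo box's rawValue):

```javascript
// Switch on the visible text of the selection, so fields sharing global
// data receive the displayed name rather than an item index.
function populateFromSelection(selectedText) {
  let department;
  switch (selectedText) {
    case "Alice Smith":
      department = "Sales";
      break;
    case "Bob Jones":
      department = "Engineering";
      break;
    default:
      department = "";
  }
  return department;
}

console.log(populateFromSelection("Alice Smith")); // "Sales"
```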