Calculating equally spaced values over a range
Any help would be really appreciated,
I've been asked: Calculate Range for a sequence of 101 equally-spaced angles in the range from -90 to +90.
Basically I need 101 values equally spaced between -90 and 90; each of these values needs to be inserted into a range formula and graphed.
This is the VI I'm working on; the unassigned T input variables on my formula nodes are the angle values I need to calculate.
Or use the Ramp Pattern function to create a 1D array of equally distributed values between two limits (-90 and 90).
Thoric (CLA, CLED, CTD and LabVIEW Champion)
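For readers without LabVIEW to hand, the spacing math behind Ramp Pattern is simple: step = (stop - start) / (n - 1). A minimal Python sketch of that idea (an illustration, not the Ramp Pattern VI itself):

```python
# 101 equally spaced angles from -90 to +90.
# step = (stop - start) / (n - 1) = 180 / 100 = 1.8 degrees.
def ramp(start, stop, n):
    step = (stop - start) / (n - 1)
    return [start + i * step for i in range(n)]

angles = ramp(-90.0, 90.0, 101)
```

Each angle can then be fed into the range formula and plotted.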
Similar Messages
-
SUM(CASE): how to use this structure to get average values over a date range
I am using:
Oracle SQL Developer (3.0.04) Build MAin-04.34 Oracle Database 11g Enterprise Edition 11.2.0.1.0 - 64bit Production
How do I use the sum function with a case structure inside.
So I have data that has an ID, a date, and a value. I am looking to get the 7-day average for the date range of 4/1/2013 through 4/20/2013.
with t as (
select 1 ID_Key,to_date('4/1/2013','mm-dd-yyyy') date_val, 10 Value_num from dual union all
select 1 ID_key,to_date('4/2/2013','mm-dd-yyyy'), 15 from dual union all
select 1 ID_key,to_date('4/3/2013','mm-dd-yyyy'), 20 from dual union all
select 1 ID_key,to_date('4/5/2013','mm-dd-yyyy'), 0 from dual union all
select 1 ID_key,to_date('4/8/2013','mm-dd-yyyy'), 12 from dual union all
select 1 ID_key,to_date('4/9/2013','mm-dd-yyyy'), 8 from dual union all
select 1 ID_key,to_date('4/10/2013','mm-dd-yyyy'), 6 from dual union all
select 1 ID_key,to_date('4/12/2013','mm-dd-yyyy'), 10 from dual union all
select 1 ID_key,to_date('4/13/2013','mm-dd-yyyy'), 0 from dual union all
select 1 ID_key,to_date('4/14/2013','mm-dd-yyyy'), 0 from dual union all
select 1 ID_key,to_date('4/15/2013','mm-dd-yyyy'), 10 from dual union all
select 1 ID_key,to_date('4/16/2013','mm-dd-yyyy'), 5 from dual union all
select 1 ID_key,to_date('4/17/2013','mm-dd-yyyy'), 2 from dual union all
select 1 ID_key,to_date('4/20/2013','mm-dd-yyyy'), 3 from dual union all
select 2 ID_key,to_date('4/3/2013','mm-dd-yyyy'), 12 from dual union all
select 2 ID_key,to_date('4/5/2013','mm-dd-yyyy'), 15 from dual union all
select 2 ID_key,to_date('4/6/2013','mm-dd-yyyy'), 5 from dual union all
select 2 ID_key,to_date('4/7/2013','mm-dd-yyyy'), 7 from dual union all
select 2 ID_key,to_date('4/9/2013','mm-dd-yyyy'), 10 from dual union all
select 2 ID_key,to_date('4/11/2013','mm-dd-yyyy'), 5 from dual union all
select 2 ID_key,to_date('4/12/2013','mm-dd-yyyy'), 0 from dual union all
select 2 ID_key,to_date('4/13/2013','mm-dd-yyyy'), 0 from dual union all
select 2 ID_key,to_date('4/15/2013','mm-dd-yyyy'), 6 from dual union all
select 2 ID_key,to_date('4/16/2013','mm-dd-yyyy'), 8 from dual union all
select 2 ID_key,to_date('4/17/2013','mm-dd-yyyy'), 0 from dual union all
select 2 ID_key,to_date('4/18/2013','mm-dd-yyyy'), 10 from dual union all
select 2 ID_key,to_date('4/19/2013','mm-dd-yyyy'), 5 from dual
)
Please let me know if the table does not load.
I would like to get the 7-day average as long as the row's date has enough previous dates; if not, it should return null.
the results should look like this
ID_Key date_val Value_num 7Day_Avg 7Day_Avg2
1 4/1/2013 10 null null
1 4/2/2013 15 null null
1 4/3/2013 20 null null
1 4/5/2013 0 null null
1 4/8/2013 12 6.71 11.75
1 4/9/2013 8 5.71 10.00
1 4/10/2013 6 3.71 6.50
1 4/12/2013 10 5.14 9.00
1 4/13/2013 0 5.14 7.20
1 4/14/2013 0 5.14 6.00
1 4/15/2013 10 4.86 5.67
1 4/16/2013 5 4.42 5.17
1 4/17/2013 2 3.85 4.50
1 4/20/2013 3 2.86 4.00
2 4/3/2013 12 null null
2 4/5/2013 15 null null
2 4/6/2013 5 null null
2 4/7/2013 7 5.57 9.75
2 4/9/2013 10 7.00 9.80
2 4/11/2013 5 6.00 8.40
2 4/12/2013 0 3.86 5.40
2 4/13/2013 0 3.14 4.40
2 4/15/2013 6 3.00 4.20
2 4/16/2013 8 2.71 3.80
2 4/17/2013 0 2.71 3.17
2 4/18/2013 10 3.43 4.00
2 4/19/2013 5 4.14 4.83
As you may notice, there are gaps in the dates, so the missing values are treated as zeros for 7Day_Avg and then ignored for 7Day_Avg2 (not counted in the number of days averaged, since there is no value_num row).
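The two averages differ only in the divisor: the window sum divided by a fixed 7 (gaps count as zero) versus divided by the number of rows actually present (gaps ignored). A small Python sketch using a hypothetical slice of the ID_Key 1 data:

```python
from datetime import date, timedelta

# Hypothetical mini data set mirroring ID_Key 1 above: {date: value}.
rows = {date(2013, 4, 1): 10, date(2013, 4, 2): 15, date(2013, 4, 3): 20,
        date(2013, 4, 5): 0, date(2013, 4, 8): 12}

def seven_day_avgs(rows, d):
    # Window = the 7 calendar days ending on d (6 preceding + current).
    window = [rows[d - timedelta(days=k)] for k in range(7)
              if d - timedelta(days=k) in rows]
    total = sum(window)
    return total / 7.0, total / len(window)  # gaps as zero vs. gaps ignored

avg7, avg7_2 = seven_day_avgs(rows, date(2013, 4, 8))
```

For 4/8 the window holds 12 + 0 + 20 + 15 = 47, giving 47/7 ≈ 6.71 and 47/4 = 11.75, matching the 7Day_Avg and 7Day_Avg2 columns above.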
I was trying something like this to start, but getting error "missing keyword":
select
t.*,
sum(
case
when date_val between :day2 - 6 and :day2
then value_num between date_val - 6 and date_val
else null
end
as 7Day_avg
from t
Should I have the case structure outside the sum function?
Any thoughts??
Edited by: 1004407 on Jun 7, 2013 11:06 AM
Hi,
If you want the average of the last 7 days, including the current day, then the RANGE should be 6 PRECEDING, not 7.
Try this:
WITH got_min_date_val AS
(
SELECT id_key, date_val, value_num
, MIN (date_val) OVER () AS min_date_val
FROM t
WHERE date_val BETWEEN TO_DATE ('04-01-2013', 'mm-dd-yyyy')
AND TO_DATE ('04-20-2013', 'mm-dd-yyyy')
)
SELECT id_key, date_val, value_num
, CASE
WHEN date_val >= min_date_val + 6
THEN SUM (value_num) OVER ( PARTITION BY id_key
ORDER BY date_val
RANGE 6 PRECEDING
)
/ 7
END AS avg_7_day
, CASE
WHEN date_val >= min_date_val + 6
THEN AVG (value_num) OVER ( PARTITION BY id_key
ORDER BY date_val
RANGE 6 PRECEDING
)
END AS avg_7_day_2
FROM got_min_date_val
ORDER BY id_key
, date_val
Output:
ID_KEY DATE_VAL VALUE_NUM AVG_7_DAY AVG_7_DAY_2
1 01-APR-13 10
1 02-APR-13 15
1 03-APR-13 20
1 05-APR-13 0
1 08-APR-13 12 6.71 11.75
1 09-APR-13 8 5.71 10.00
1 10-APR-13 6 3.71 6.50
1 12-APR-13 10 5.14 9.00
1 13-APR-13 0 5.14 7.20
1 14-APR-13 0 5.14 6.00
1 15-APR-13 10 4.86 5.67
1 16-APR-13 5 4.43 5.17
1 17-APR-13 2 3.86 4.50
1 20-APR-13 3 2.86 4.00
2 03-APR-13 12
2 05-APR-13 15
2 06-APR-13 5
2 07-APR-13 7 5.57 9.75
2 09-APR-13 10 7.00 9.80
2 11-APR-13 5 6.00 8.40
2 12-APR-13 0 3.86 5.40
2 13-APR-13 0 3.14 4.40
2 15-APR-13 6 3.00 4.20
2 16-APR-13 8 2.71 3.80
2 17-APR-13 0 2.71 3.17
2 18-APR-13 10 3.43 4.00
2 19-APR-13 5 4.14 4.83
Message was edited by: FrankKulash
Sorry; I meant to reply to OP, not to Greg -
DP macros, calculating average values over a period of time
Hello
I have a key figure row, for the future i want this key figure to contain the average of a different key figure row over the last year.
How would you go about calculating the average value of a key figure row over a period of time and then assigning this value to another key figure?
I've tried variations of AVG() and SUM() & SUM_CALC, but none of them seem to get me anywhere. I may not understand completely how rows & values work, so any tips would be helpful.
In pseudo-logic, what I need to do is:
Calculate the average value of a key figure row over a given period (the last year)
Store this value somewhere, in regular programming it'd be a variable of some kind.
Assign this value to another key figure row 18 months into the future.
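Outside of DP macros, the three steps above are just: average one series over a trailing window, hold the result, and write it into another series at a future offset. A rough Python sketch with hypothetical key-figure dicts (not the SAP APO API):

```python
# Key figures as {period_index: value}; periods are months here (hypothetical data).
history = {m: 100 + m for m in range(1, 13)}   # last year's key figure row

# 1. Calculate the average of the key figure over the given period (the last year).
avg = sum(history.values()) / len(history)

# 2. Store this value (here: an ordinary variable).
# 3. Assign it to another key figure row 18 months into the future.
target = {}
for m in history:
    target[m + 18] = avg
```

In a real macro the "store and assign" step is what the BADI method would do against the planning matrix.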
Regards
Simon Pedersen
Hi Simon,
If you are a technical guy, you can create a BADI implementation for that macro and manipulate the matrix data like the way you want.
the procedure to implement a BADI is
1. SPRO --> SAP SCM - Implementation Guide --> Advanced Planning and Optimization --> Supply Chain Planning --> Demand Planning (DP) --> Business Add-Ins (BAdIs) --> MacroBuilder --> Additional Functions for Macros.
Create a new implementation by copying the class for the BADI definition '/SAPAPO/ADVX' and write your own code in the method '/SAPAPO/IF_EX_ADVX~USER_EXIT_MACRO'. There is sample code and a procedure explaining how to handle the data in the internal tables C_T_TAB and C_T_TAB_OLD. The calculations can be made with the help of I_T_LINES and I_T_COLS, which are the rows and columns tables.
Find out the row and columns of the grid to be read, do the calculation, and then put the result in the desired cell.
Please let me know if you need further assistance.
Regards,
Srini.
Award points for the helpful answers -
How to calculate query value on extended range of data
Hi,
This is my first post, so let me greet all forum users. I've been reading the forum for a few weeks and I'm really impressed with the strength of this community.
My question is based on some real problems but I'll ask in general:
In a BEx query - is it possible (and how?) for a result cell to obtain a value that is calculated over a more extended range of infoprovider records than follows from the characteristics related to that cell?
The question is in a sense the reverse of a restriction in Selection: I'm asking about a kind of 'extension' of the selection.
An example for the question follows (it is deliberately simple, I put it here just to picture the problem; note: the question is more general!)
cube: ch: 0CALMONTH, 0MATERIAL, 0PLANT
kf: 0QUANTITY (ex. of outgoing deliveries)
What I need is to obtain in any query the SUM(0QUANTITY) over all 0PLANT (this can lead to knowing the 'activity' of the specific plant by calculating the share. that activity can be calculated at diferent level of detail).
The 'dream' query would look like:
rows: 0CALMONTH, 0MATERIAL, 0PLANT
cols: 0QUANTITY SUM_OVER_0PLANT
resulting in ex:
0CALMONTH 0MATERIAL 0PLANT 0QUANTITY SUM_OVER_0PLANT PLANT_ACTIVITY
2005.01 00001 P100 10 30 1/3
2005.01 00001 P200 20 30 2/3
2005.01 00002 P100 30 70 3/7
2005.01 00002 P200 40 70 4/7
2005.02 00001 P100 50 110 5/11
2005.02 00001 P200 60 110 6/11
2005.02 00002 P100 70 150 7/15
2005.02 00002 P200 80 150 8/15
after removing drill 0MATERIAL:
0CALMONTH 0PLANT 0QUANTITY SUM_OVER_0PLANT PLANT_ACTIVITY
2005.01 P100 40 100 4/10
2005.01 P200 60 100 6/10
2005.02 P100 120 260 12/26
2005.02 P200 140 260 14/26
after removing drill 0CALMONTH:
0PLANT 0QUANTITY SUM_OVER_0PLANT PLANT_ACTIVITY
P100 160 360 16/36
P200 200 360 20/36
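The arithmetic behind SUM_OVER_0PLANT and PLANT_ACTIVITY is an ordinary group total and share at the current drill-down level. A minimal Python sketch of the first table (hypothetical tuples, not BEx):

```python
# (month, material, plant) -> quantity, from the first example table above.
data = {("2005.01", "00001", "P100"): 10, ("2005.01", "00001", "P200"): 20,
        ("2005.01", "00002", "P100"): 30, ("2005.01", "00002", "P200"): 40}

def activity(month, material, plant):
    qty = data[(month, material, plant)]
    # SUM_OVER_0PLANT: total over all plants for the same drill-down cell.
    total = sum(v for (m, mat, _), v in data.items()
                if m == month and mat == material)
    return qty, total, qty / total   # PLANT_ACTIVITY = share of the total
```

Removing a drill-down characteristic just widens the group the total is taken over, which is exactly the OLAP-freedom requirement the poster states.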
For this specific case I investigated several solutions:
1) using a formula & the function SUMCT (Result). This has the drawback that the Result for 0PLANT must be present in the right place (bottom level) in the resulting query. I don't like this.
2) SUM can be directly precalculated in the cube. Well, I'd prefer Bex only...
Please keep in mind also following:
1) there may be a need to refer to the data that is outside prompt/filtering/restricting range.
2) I want to preserve all the OLAP freedom, so all the solution should be a query with some tricky formula/kf/??? to be used by user in any situation and producing right result.
I hope I'm not demanding too much...
(now after this long example please have a look at the question again!)
Regards,
Mirek
Hi Ashwin,
while using SUMCT I lose the freedom of OLAP. SUMCT calculates SUM_OVER_0PLANT properly only when 0PLANT is the lowest drill-down level. If this is not the case (i.e. some other characteristic is the lowest level, or the 0PLANT results are suppressed), the approach won't produce the right result.
sure I will reward all helpful posts.
regards,
Mirek -
Equal spacing between images of uniform height but variable widths with AS3?
I have to figure out how to insert images of variable widths at equal spacing intervals.
Here is where I got so far:
function buildScroller(imageList:XMLList):void{
trace("build Scroller");
for (var item:uint = 0; item<imageList.length();item++) {
var thisOne:MovieClip = new MovieClip();
var currentX = 90;
var spaceBetween = 20;
var currentImageWidth = Number(imageList[item].attribute("width"));
thisOne.x = currentX;
thisOne.x = (currentImageWidth+spaceBetween)*item;
I can see that my images are being spread out on the page, and if I change the number in var spaceBetween it affects the spacing. However, the spacing is not uniform. I cannot figure out why. Perhaps it is because I cannot properly retrieve the image widths from the XML file. I assigned a width in the XML file in the following manner:
<images>
<image src="appThmb_imgs/A-illuminatorUpLit_xsm.jpg" title="UpDownGlowingVase" url="http://www.888acolyte.com" width="132"/>
<image src="appThmb_imgs/ATI-1-bgpA_xsm.jpg" title="CoolingReflections" url="http://www.888acolyte.com" width="117"/>
<image src="appThmb_imgs/ATI-2-zenC_xsm.jpg" title="OrchidsUnderGlass" url="http://www.888acolyte.com" width="263"/>
<image src="appThmb_imgs/SilverBloom_RGB_xsm.jpg" title="SilverBloom" url="http://www.888acolyte.com" width="148"/>
</images>
they correspond to actual image width. I do however want them to be scaled at 50% of their actual width.
Any ideas if I am missing a line of code or don't call out images properly?
Thank You for your reply.
I tried to implement it but the images are still jumbled together.
Perhaps it has something to do with the fact that I specify the actual width of the image in the XML file but then define it as a scaled value in the code later on (11th line at the bottom of the code):
mc.height = 110;
This height definition will just give the width proportional to the height. Does it mean that I cannot specify the height as it is?
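Going back to the spacing question: uneven gaps usually come from recomputing each x from the current item alone instead of accumulating the running position. The layout math, sketched in Python for brevity (the widths are the ones from the XML above; this is an illustration, not AS3):

```python
widths = [132, 117, 263, 148]       # from the XML width attributes
gap = 20

# Each image starts where the previous one ended, plus the gap.
xs, next_x = [], gap
for w in widths:
    xs.append(next_x)
    next_x += w + gap               # accumulate, don't overwrite
```

In the AS3 loop the equivalent is `nextX += width + spaceBetween` after placing each clip, rather than assigning `nextX` from scratch each iteration.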
Also, could there be a mistake in the XML file? I just assigned the width number as it is in pixels for each image. Here is an example:
<images>
<image src="appThmb_imgs/RosesGallasGalore_RGB_xsm.jpg" title="RosesGallasGalore" url="http://www.888acolyte.com" width="131"/>
<image src="appThmb_imgs/SangriaBling_RGB_xsm.jpg" title="SangriaBling" url="http://www.888acolyte.com" width="233"/>
<image src="appThmb_imgs/SilverBloom_RGB_xsm.jpg" title="SilverBloom" url="http://www.888acolyte.com" width="148"/>
</images>
There were no errors in the output or compiler errors tabs.
Here is my code in its entirety:
import com.greensock.*;
import com.greensock.easing.*;
//load xml
var xmlLoader:URLLoader = new URLLoader();
/////Parse XML
var xmlData:XML = new XML();
var xmlPath:String = "app_thmbs_imgsModfd.xml";
xmlLoader.load(new URLRequest(xmlPath));
trace("loading xml from: " + xmlPath);
xmlLoader.addEventListener(Event.COMPLETE, LoadXML);
function LoadXML(e:Event):void {
trace("xml load complete");
xmlData = new XML(e.target.data);
buildScroller(xmlData.image);
}
/////Build Scroller MovieClip to Contain Each Image
var scroller:MovieClip = new MovieClip();
this.addChild(scroller);
scroller.y = 30;
/////Parse XML
//build scroller from xml
var spaceBetween:int = 20;
function buildScroller(imageList:XMLList):void{
trace("build Scroller");
var nextX:int=spaceBetween; //not sure where you want to start;
for (var item:uint = 0; item<imageList.length();item++) {
var thisOne:MovieClip = new MovieClip();
thisOne.x=nextX;
nextX=int(imageList[item].attribute("width"))+spaceBetween;
//outline
var blackBox:Sprite = new Sprite();
blackBox.graphics.beginFill(0xFFFFFF);
blackBox.graphics.drawRect(-1, -1, 124, 107);
thisOne.addChild(blackBox);
thisOne.itemNum = item;
thisOne.title = imageList[item].attribute("title");
thisOne.link = imageList[item].attribute("url");
thisOne.src = imageList[item].attribute("src");
thisOne.alpha = 0;
//Loading and Adding the Images
//image container
var thisThumb:MovieClip = new MovieClip();
//add image
var ldr:Loader = new Loader();
var url:String = imageList[item].attribute("src");
var urlReq:URLRequest = new URLRequest(url);
trace("loading thumbnail "+item+" into Scroller: " + url);
//assign event listeners for Loader
ldr.contentLoaderInfo.addEventListener(Event.COMPLETE,completeHandler);
ldr.contentLoaderInfo.addEventListener(IOErrorEvent.IO_ERROR, errorHandler);
ldr.load(urlReq);
thisThumb.addChild(ldr);
thisOne.addChild(thisThumb);
//create listeners for this thumb
thisOne.buttonMode = true;
thisOne.addEventListener(MouseEvent.CLICK, clickScrollerItem);
thisOne.addEventListener(MouseEvent.MOUSE_OVER, overScrollerItem);
thisOne.addEventListener(MouseEvent.MOUSE_OUT, outScrollerItem);
//add item
scroller.addChild(thisOne);
}
trace("termination of build scroller");
}
function clickScrollerItem(e:MouseEvent):void{
trace("clicked item " +e.currentTarget.itemNum + " - visit url: " +e.currentTarget.link);
}
function overScrollerItem(e:MouseEvent):void{
trace("over"+e.currentTarget.title);
}
function outScrollerItem(e:MouseEvent):void{
trace("out"+e.currentTarget.title);
}
function completeHandler(e:Event):void{
//trace("thumbnail complete "+e.target.loader.parent.parent.title)
TweenMax.to(e.target.loader.parent.parent, .5, {alpha:1});
//size image into scroller
resizeMe(e.target.loader.parent, 140, 105, true, true, false);
}
function errorHandler(e:IOErrorEvent):void{
trace("thumbnail error="+e);
}
function resizeMe(mc:DisplayObject, maxH:Number, maxW:Number=0, constrainProportions:Boolean=true, centerHor:Boolean=true, centerVert:Boolean=true):void{
maxH = maxH == 0 ? maxW : maxH;
mc.width = maxW;
mc.height = 110;
mc.scaleX = mc.scaleY;
if (centerHor) {
mc.x = (maxW - mc.width) / 2;
}
if (centerVert){
mc.y = (maxH - mc.height) / 2;
}
} -
How to get top 11 values per date range
I want to get the top 11 values by date range.
Sample Data
CREATE TABLE SAMPLE_DATA
(
DOMAIN_NAME VARCHAR2(100),
QTD NUMBER,
LOAD_DATE DATE
);
-- Insert
BEGIN
FOR lc IN 1..20
LOOP
FOR ld IN 1..30
LOOP
INSERT
INTO SAMPLE_DATA VALUES
(
'DM_'
||lc,
round(dbms_random.value(0,1000)),
SYSDATE-ld
);
END LOOP;
END LOOP;
COMMIT;
END;
SELECT *
FROM
(SELECT DOMAIN_NAME,
QTD,
LOAD_DATE
FROM
(SELECT DOMAIN_NAME,
QTD,
LOAD_DATE
FROM SAMPLE_DATA
WHERE LOAD_DATE = TRUNC(SYSDATE-3)
ORDER BY QTD DESC
)
WHERE ROWNUM <= 10
UNION ALL
SELECT 'Others' DOMAIN_NAME,
SUM(QTD) QTD,
LOAD_DATE
FROM
(SELECT DOMAIN_NAME,
QTD,
LOAD_DATE
FROM
(SELECT rownum rn,
DOMAIN_NAME,
QTD,
LOAD_DATE
FROM
(SELECT DOMAIN_NAME,
QTD,
LOAD_DATE
FROM SAMPLE_DATA
WHERE LOAD_DATE = TRUNC(SYSDATE-3)
ORDER BY QTD DESC
)
)
WHERE rn > 10
)
GROUP BY LOAD_DATE
)
ORDER BY QTD DESC
-- Result
DOMAIN_NAME QTD LOAD_DATE
Others 2888 24/03/13
DM_1 1000 24/03/13
DM_20 933 24/03/13
DM_11 913 24/03/13
DM_3 743 24/03/13
DM_13 572 24/03/13
DM_12 568 24/03/13
DM_9 564 24/03/13
DM_6 505 24/03/13
DM_5 504 24/03/13
DM_2 480 24/03/13
Please help me get this result in one query using a date range,
e.g
using LOAD_DATE BETWEEN '24/03/13' AND '25/03/13'
DOMAIN_NAME QTD LOAD_DATE
Others 2888 24/03/13
DM_1 1000 24/03/13
DM_20 933 24/03/13
DM_11 913 24/03/13
DM_3 743 24/03/13
DM_13 572 24/03/13
DM_12 568 24/03/13
DM_9 564 24/03/13
DM_6 505 24/03/13
DM_5 504 24/03/13
DM_2 480 24/03/13
Others 1948 25/03/13
DM_1 807 25/03/13
DM_8 764 25/03/13
DM_7 761 25/03/13
DM_11 656 25/03/13
DM_18 611 25/03/13
DM_17 523 25/03/13
DM_14 467 25/03/13
DM_19 447 25/03/13
DM_15 437 25/03/13
DM_6 380 25/03/13
Thank you in advance.
I got the solution. Just sharing.
I used analytic functions that make my job easy.
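The same "top 10 plus Others" rollup per date can be sketched procedurally to check the SQL against. A Python illustration with hypothetical (name, qtd) rows for a single load_date:

```python
def top_n_with_others(rows, n=10):
    # rows: list of (domain_name, qtd) for one load_date.
    ranked = sorted(rows, key=lambda r: r[1], reverse=True)
    top, rest = ranked[:n], ranked[n:]
    # Everything outside the top n is rolled up into a single 'Others' row.
    result = [("Others", sum(q for _, q in rest))] if rest else []
    return result + top

rows = [("DM_%d" % i, i * 10) for i in range(1, 14)]  # 13 hypothetical domains
out = top_n_with_others(rows, n=10)
```

For a date range, the SQL below simply repeats this per LOAD_DATE via the DENSE_RANK partition.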
Sample Data
DOMAIN_NAME QTD LOAD_DATE
DM_1 807 25/03/2013
DM_1 1000 24/03/2013
DM_2 226 25/03/2013
DM_2 480 24/03/2013
DM_3 244 25/03/2013
DM_3 743 24/03/2013
DM_4 48 25/03/2013
DM_4 413 24/03/2013
DM_5 164 25/03/2013
DM_5 504 24/03/2013
DM_6 380 25/03/2013
DM_6 505 24/03/2013
DM_7 761 25/03/2013
DM_7 212 24/03/2013
DM_8 764 25/03/2013
DM_8 308 24/03/2013
DM_9 354 25/03/2013
DM_9 564 24/03/2013
DM_10 214 25/03/2013
DM_10 367 24/03/2013
DM_11 656 25/03/2013
DM_11 913 24/03/2013
DM_12 37 25/03/2013
DM_12 568 24/03/2013
DM_13 332 25/03/2013
DM_13 572 24/03/2013
DM_14 467 25/03/2013
DM_14 87 24/03/2013
DM_15 437 25/03/2013
DM_15 450 24/03/2013
DM_16 238 25/03/2013
DM_16 299 24/03/2013
DM_17 523 25/03/2013
DM_17 143 24/03/2013
DM_18 611 25/03/2013
DM_18 145 24/03/2013
DM_19 447 25/03/2013
DM_19 464 24/03/2013
DM_20 91 25/03/2013
DM_20 933 24/03/2013
Top 11 QTD of DOMAIN_NAME per Date Range.
SELECT *
FROM
(SELECT DOMAIN_NAME,
QTD,
LOAD_DATE
FROM
(SELECT LOAD_DATE,
DOMAIN_NAME,
QTD,
(DENSE_RANK() OVER (PARTITION BY LOAD_DATE ORDER BY QTD DESC)) AS RANK_QTD
FROM SAMPLE_DATA
WHERE trunc(load_date) BETWEEN TO_DATE('24/03/2013', 'dd/mm/yyyy')
AND TO_DATE('25/03/2013', 'dd/mm/yyyy')
)
WHERE RANK_QTD <= 10
UNION ALL
SELECT 'Others',
SUM(QTD) AS QTD,
LOAD_DATE
FROM
(SELECT LOAD_DATE,
DOMAIN_NAME,
QTD,
(DENSE_RANK() OVER (PARTITION BY LOAD_DATE ORDER BY QTD DESC)) AS RANK_QTD
FROM SAMPLE_DATA
WHERE trunc(load_date) BETWEEN TO_DATE('24/03/2013', 'dd/mm/yyyy')
AND TO_DATE('25/03/2013', 'dd/mm/yyyy')
)
WHERE RANK_QTD > 10
GROUP BY LOAD_DATE
)
ORDER BY LOAD_DATE ASC,
QTD DESC
DOMAIN_NAME QTD LOAD_DATE
Others 2888 24/03/2013
DM_1 1000 24/03/2013
DM_20 933 24/03/2013
DM_11 913 24/03/2013
DM_3 743 24/03/2013
DM_13 572 24/03/2013
DM_12 568 24/03/2013
DM_9 564 24/03/2013
DM_6 505 24/03/2013
DM_5 504 24/03/2013
DM_2 480 24/03/2013
Others 1948 25/03/2013
DM_1 807 25/03/2013
DM_8 764 25/03/2013
DM_7 761 25/03/2013
DM_11 656 25/03/2013
DM_18 611 25/03/2013
DM_17 523 25/03/2013
DM_14 467 25/03/2013
DM_19 447 25/03/2013
DM_15 437 25/03/2013
DM_6 380 25/03/2013 -
Select just the values between min and max of an accumulated value over day
Hello Forum,
a value is accumulated over a day and over a period of time. The next day the value is reset and starts to be accumulated again:
with sampledata as (select to_date('09.09.2012 00:04:08', 'dd.mm.yyyy hh24:mi:ss') ts, 120 val from dual union all
select to_date('09.09.2012 01:03:08', 'dd.mm.yyyy hh24:mi:ss') ts, 120 val from dual union all
select to_date('09.09.2012 02:54:11', 'dd.mm.yyyy hh24:mi:ss') ts, 120 val from dual union all
select to_date('09.09.2012 03:04:08', 'dd.mm.yyyy hh24:mi:ss') ts, 120 val from dual union all
select to_date('09.09.2012 04:04:19', 'dd.mm.yyyy hh24:mi:ss') ts, 120 val from dual union all
select to_date('09.09.2012 05:04:20', 'dd.mm.yyyy hh24:mi:ss') ts, 120 val from dual union all
select to_date('09.09.2012 06:12:02', 'dd.mm.yyyy hh24:mi:ss') ts, 23 val from dual union all
select to_date('09.09.2012 07:12:03', 'dd.mm.yyyy hh24:mi:ss') ts, 29 val from dual union all
select to_date('09.09.2012 08:12:04', 'dd.mm.yyyy hh24:mi:ss') ts, 30 val from dual union all
select to_date('09.09.2012 09:12:11', 'dd.mm.yyyy hh24:mi:ss') ts, 45 val from dual union all
select to_date('09.09.2012 10:12:12', 'dd.mm.yyyy hh24:mi:ss') ts, 60 val from dual union all
select to_date('09.09.2012 11:12:13', 'dd.mm.yyyy hh24:mi:ss') ts, 75 val from dual union all
select to_date('09.09.2012 12:21:24', 'dd.mm.yyyy hh24:mi:ss') ts, 95 val from dual union all
select to_date('09.09.2012 13:21:26', 'dd.mm.yyyy hh24:mi:ss') ts, 120 val from dual union all
select to_date('09.09.2012 14:21:27', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
select to_date('09.09.2012 15:21:30', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
select to_date('09.09.2012 16:21:32', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
select to_date('09.09.2012 17:21:33', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
select to_date('09.09.2012 21:21:33', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
select to_date('09.09.2012 23:21:33', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
select to_date('10.09.2012 00:04:08', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
select to_date('10.09.2012 01:03:08', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
select to_date('10.09.2012 02:54:11', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
select to_date('10.09.2012 03:04:08', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
select to_date('10.09.2012 04:04:19', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
select to_date('10.09.2012 05:04:20', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
select to_date('10.09.2012 06:12:02', 'dd.mm.yyyy hh24:mi:ss') ts, 14 val from dual union all
select to_date('10.09.2012 07:12:03', 'dd.mm.yyyy hh24:mi:ss') ts, 34 val from dual union all
select to_date('10.09.2012 08:12:04', 'dd.mm.yyyy hh24:mi:ss') ts, 58 val from dual union all
select to_date('10.09.2012 09:12:11', 'dd.mm.yyyy hh24:mi:ss') ts, 70 val from dual union all
select to_date('10.09.2012 10:12:12', 'dd.mm.yyyy hh24:mi:ss') ts, 120 val from dual union all
select to_date('10.09.2012 11:12:13', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
select to_date('10.09.2012 12:21:24', 'dd.mm.yyyy hh24:mi:ss') ts, 153 val from dual union all
select to_date('10.09.2012 13:21:26', 'dd.mm.yyyy hh24:mi:ss') ts, 153 val from dual union all
select to_date('10.09.2012 14:21:27', 'dd.mm.yyyy hh24:mi:ss') ts, 153 val from dual union all
select to_date('10.09.2012 15:21:30', 'dd.mm.yyyy hh24:mi:ss') ts, 153 val from dual union all
select to_date('10.09.2012 16:21:32', 'dd.mm.yyyy hh24:mi:ss') ts, 153 val from dual union all
select to_date('10.09.2012 21:21:33', 'dd.mm.yyyy hh24:mi:ss') ts, 153 val from dual)
select ts, val
from sampledata
order by ts asc;
How should I change the select statement to skip all data sets before the first minimum and the duplicates after the maximum of a day, in order to get such a result:
TS VAL
09.09.12 06:12 23
09.09.12 07:12 29
09.09.12 08:12 30
09.09.12 09:12 45
09.09.12 10:12 60
09.09.12 11:12 75
09.09.12 12:21 95
09.09.12 13:21 120
09.09.12 14:21 142
10.09.12 06:12 14
10.09.12 07:12 34
10.09.12 08:12 58
10.09.12 09:12 70
10.09.12 10:12 120
10.09.12 11:12 142
10.09.12 12:21 153
Thank you
This solution works perfectly when the accumulated value has its low and its high on the same day. But I found out that there is also data which has its low yesterday and its high today. For a better understanding of the case: there is a machine which works over 3 shifts with irregular start and end times. For example, shift 1 can start at 5:50 or at 7:15. The accumulated value of the worked time is accumulated for each shift separately. This solution works for shift 1 (approximately 06:00-14:00) and for shift 2 (approximately 14:00-22:00), because there the low and the high of the accumulated value are on the same day. It does not work for shift 3 (approximately 22:00-06:00), because the high of the accumulated value is, or can be, on the next day.
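Conceptually, the filter keeps only the strictly increasing readings after each reset (a drop in the accumulated counter marks a shift change), discarding the leading plateau carried over from the previous shift and the trailing duplicates after the maximum. A rough Python sketch of that idea with hypothetical values (this is the concept, not the Oracle analytic query):

```python
def rising_after_resets(vals):
    # A reset is any drop in the accumulated counter (a new shift begins).
    kept, last_kept, prev, started = [], None, None, False
    for v in vals:
        if prev is not None and v < prev:       # reset detected
            started, last_kept = True, None
        if started and (last_kept is None or v > last_kept):
            kept.append(v)                      # first strictly-higher reading
            last_kept = v
        prev = v
    return kept

vals = [120, 120, 23, 29, 30, 142, 142, 142, 14, 34, 153, 153]
```

This mirrors the shift 1/2 case; handling shift 3, where the high falls on the next calendar day, additionally needs the timestamps so segments are not cut at midnight.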
So the thread title should be: "Select just the values between min and max of an accumulated value over the same day (today) or over two successive days (yesterday and today)".
Sampledata for shift 1 or shift 2:
{code}
with sampledata as (select to_date('09.09.2012 00:04:08', 'dd.mm.yyyy hh24:mi:ss') ts, 120 val from dual union all
select to_date('09.09.2012 01:03:08', 'dd.mm.yyyy hh24:mi:ss') ts, 120 val from dual union all
select to_date('09.09.2012 02:54:11', 'dd.mm.yyyy hh24:mi:ss') ts, 120 val from dual union all
select to_date('09.09.2012 03:04:08', 'dd.mm.yyyy hh24:mi:ss') ts, 120 val from dual union all
select to_date('09.09.2012 04:04:19', 'dd.mm.yyyy hh24:mi:ss') ts, 120 val from dual union all
select to_date('09.09.2012 05:04:20', 'dd.mm.yyyy hh24:mi:ss') ts, 120 val from dual union all
select to_date('09.09.2012 06:12:02', 'dd.mm.yyyy hh24:mi:ss') ts, 23 val from dual union all
select to_date('09.09.2012 07:12:03', 'dd.mm.yyyy hh24:mi:ss') ts, 29 val from dual union all
select to_date('09.09.2012 08:12:04', 'dd.mm.yyyy hh24:mi:ss') ts, 30 val from dual union all
select to_date('09.09.2012 09:12:11', 'dd.mm.yyyy hh24:mi:ss') ts, 45 val from dual union all
select to_date('09.09.2012 10:12:12', 'dd.mm.yyyy hh24:mi:ss') ts, 60 val from dual union all
select to_date('09.09.2012 11:12:13', 'dd.mm.yyyy hh24:mi:ss') ts, 75 val from dual union all
select to_date('09.09.2012 12:21:24', 'dd.mm.yyyy hh24:mi:ss') ts, 95 val from dual union all
select to_date('09.09.2012 13:21:26', 'dd.mm.yyyy hh24:mi:ss') ts, 120 val from dual union all
select to_date('09.09.2012 14:21:27', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
select to_date('09.09.2012 15:21:30', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
select to_date('09.09.2012 16:21:32', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
select to_date('09.09.2012 17:21:33', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
select to_date('09.09.2012 21:21:33', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
select to_date('09.09.2012 23:21:33', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
select to_date('10.09.2012 00:04:08', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
select to_date('10.09.2012 01:03:08', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
select to_date('10.09.2012 02:54:11', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
select to_date('10.09.2012 03:04:08', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
select to_date('10.09.2012 04:04:19', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
select to_date('10.09.2012 05:04:20', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
select to_date('10.09.2012 06:12:02', 'dd.mm.yyyy hh24:mi:ss') ts, 143 val from dual union all
select to_date('10.09.2012 07:12:03', 'dd.mm.yyyy hh24:mi:ss') ts, 144 val from dual union all
select to_date('10.09.2012 08:12:04', 'dd.mm.yyyy hh24:mi:ss') ts, 145 val from dual union all
select to_date('10.09.2012 09:12:11', 'dd.mm.yyyy hh24:mi:ss') ts, 146 val from dual union all
select to_date('10.09.2012 10:12:12', 'dd.mm.yyyy hh24:mi:ss') ts, 147 val from dual union all
select to_date('10.09.2012 11:12:13', 'dd.mm.yyyy hh24:mi:ss') ts, 148 val from dual union all
select to_date('10.09.2012 12:21:24', 'dd.mm.yyyy hh24:mi:ss') ts, 153 val from dual union all
select to_date('10.09.2012 13:21:26', 'dd.mm.yyyy hh24:mi:ss') ts, 153 val from dual union all
select to_date('10.09.2012 14:21:27', 'dd.mm.yyyy hh24:mi:ss') ts, 153 val from dual union all
select to_date('10.09.2012 15:21:30', 'dd.mm.yyyy hh24:mi:ss') ts, 153 val from dual union all
select to_date('10.09.2012 16:21:32', 'dd.mm.yyyy hh24:mi:ss') ts, 153 val from dual union all
select to_date('10.09.2012 21:21:33', 'dd.mm.yyyy hh24:mi:ss') ts, 153 val from dual)
, got_analytics AS
(
SELECT ts, val
, MIN (val) OVER ( PARTITION BY TRUNC (ts)
ORDER BY ts DESC
) AS min_val_after
, CASE
WHEN ROW_NUMBER () OVER ( PARTITION BY TRUNC (ts)
ORDER BY val
, ts
) = 1
THEN -1 -- Impossibly low val
ELSE LAG (val) OVER ( PARTITION BY TRUNC (ts)
ORDER BY ts
)
END AS prev_val
, MIN (val) OVER (PARTITION BY TRUNC (ts))
AS low_val_today
, NVL ( LAST_VALUE (val) OVER ( ORDER BY ts
RANGE BETWEEN UNBOUNDED PRECEDING
AND ts - TRUNC (ts) PRECEDING
)
, -1
) AS last_val_yesterday
FROM sampledata
)
SELECT ts
, val
FROM got_analytics
WHERE val <= min_val_after
AND val > prev_val
AND ( val > low_val_today
OR val != last_val_yesterday
)
ORDER BY ts
{code}
with the expected results:
{code}
1 09.09.2012 06:12:02 23
2 09.09.2012 07:12:03 29
3 09.09.2012 08:12:04 30
4 09.09.2012 09:12:11 45
5 09.09.2012 10:12:12 60
6 09.09.2012 11:12:13 75
7 09.09.2012 12:21:24 95
8 09.09.2012 13:21:26 120
9 09.09.2012 14:21:27 142
10 10.09.2012 06:12:02 143
11 10.09.2012 07:12:03 144
12 10.09.2012 08:12:04 145
13 10.09.2012 09:12:11 146
14 10.09.2012 10:12:12 147
15 10.09.2012 11:12:13 148
16 10.09.2012 12:21:24 153
{code}
And the sampledata for shift 3 is:
{code}
with sampledata as (select to_date('08.09.2012 00:04:08', 'dd.mm.yyyy hh24:mi:ss') ts, 23 val from dual union ALL
select to_date('08.09.2012 02:04:08', 'dd.mm.yyyy hh24:mi:ss') ts, 45 val from dual union all
select to_date('08.09.2012 05:03:08', 'dd.mm.yyyy hh24:mi:ss') ts, 78 val from dual union all
select to_date('08.09.2012 06:54:11', 'dd.mm.yyyy hh24:mi:ss') ts, 90 val from dual union all
select to_date('08.09.2012 08:04:08', 'dd.mm.yyyy hh24:mi:ss') ts, 90 val from dual union all
select to_date('08.09.2012 10:04:19', 'dd.mm.yyyy hh24:mi:ss') ts, 90 val from dual union all
select to_date('08.09.2012 12:04:20', 'dd.mm.yyyy hh24:mi:ss') ts, 90 val from dual union all
select to_date('08.09.2012 16:12:02', 'dd.mm.yyyy hh24:mi:ss') ts, 90 val from dual union all
select to_date('08.09.2012 17:12:03', 'dd.mm.yyyy hh24:mi:ss') ts, 90 val from dual union all
select to_date('08.09.2012 19:12:04', 'dd.mm.yyyy hh24:mi:ss') ts, 90 val from dual union all
select to_date('08.09.2012 21:12:11', 'dd.mm.yyyy hh24:mi:ss') ts, 90 val from dual union all
select to_date('08.09.2012 22:00:12', 'dd.mm.yyyy hh24:mi:ss') ts, 24 val from dual union all
select to_date('08.09.2012 22:12:13', 'dd.mm.yyyy hh24:mi:ss') ts, 40 val from dual union all
select to_date('08.09.2012 23:21:24', 'dd.mm.yyyy hh24:mi:ss') ts, 68 val from dual union all
select to_date('09.09.2012 01:03:08', 'dd.mm.yyyy hh24:mi:ss') ts, 79 val from dual union all
select to_date('09.09.2012 02:54:11', 'dd.mm.yyyy hh24:mi:ss') ts, 124 val from dual union all
select to_date('09.09.2012 03:04:08', 'dd.mm.yyyy hh24:mi:ss') ts, 125 val from dual union all
select to_date('09.09.2012 04:04:19', 'dd.mm.yyyy hh24:mi:ss') ts, 126 val from dual union all
select to_date('09.09.2012 05:04:20', 'dd.mm.yyyy hh24:mi:ss') ts, 127 val from dual union all
select to_date('09.09.2012 06:12:02', 'dd.mm.yyyy hh24:mi:ss') ts, 127 val from dual union all
select to_date('09.09.2012 07:12:03', 'dd.mm.yyyy hh24:mi:ss') ts, 127 val from dual union all
select to_date('09.09.2012 08:12:04', 'dd.mm.yyyy hh24:mi:ss') ts, 127 val from dual union all
select to_date('09.09.2012 09:12:11', 'dd.mm.yyyy hh24:mi:ss') ts, 127 val from dual union all
select to_date('09.09.2012 10:12:12', 'dd.mm.yyyy hh24:mi:ss') ts, 127 val from dual union all
select to_date('09.09.2012 11:12:13', 'dd.mm.yyyy hh24:mi:ss') ts, 127 val from dual union all
select to_date('09.09.2012 12:21:24', 'dd.mm.yyyy hh24:mi:ss') ts, 127 val from dual union all
select to_date('09.09.2012 13:21:26', 'dd.mm.yyyy hh24:mi:ss') ts, 127 val from dual union all
select to_date('09.09.2012 14:21:27', 'dd.mm.yyyy hh24:mi:ss') ts, 127 val from dual union all
select to_date('09.09.2012 15:21:30', 'dd.mm.yyyy hh24:mi:ss') ts, 127 val from dual union all
select to_date('09.09.2012 16:21:32', 'dd.mm.yyyy hh24:mi:ss') ts, 127 val from dual union all
select to_date('09.09.2012 17:21:33', 'dd.mm.yyyy hh24:mi:ss') ts, 127 val from dual union all
select to_date('09.09.2012 21:21:33', 'dd.mm.yyyy hh24:mi:ss') ts, 127 val from dual union ALL
select to_date('09.09.2012 22:21:33', 'dd.mm.yyyy hh24:mi:ss') ts, 5 val from dual union ALL
select to_date('09.09.2012 22:51:33', 'dd.mm.yyyy hh24:mi:ss') ts, 23 val from dual union all
select to_date('09.09.2012 23:21:33', 'dd.mm.yyyy hh24:mi:ss') ts, 40 val from dual union all
select to_date('10.09.2012 00:04:08', 'dd.mm.yyyy hh24:mi:ss') ts, 50 val from dual union all
select to_date('10.09.2012 01:03:08', 'dd.mm.yyyy hh24:mi:ss') ts, 60 val from dual union all
select to_date('10.09.2012 02:54:11', 'dd.mm.yyyy hh24:mi:ss') ts, 78 val from dual union all
select to_date('10.09.2012 03:04:08', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
select to_date('10.09.2012 04:04:19', 'dd.mm.yyyy hh24:mi:ss') ts, 145 val from dual union all
select to_date('10.09.2012 05:04:20', 'dd.mm.yyyy hh24:mi:ss') ts, 146 val from dual union all
select to_date('10.09.2012 06:12:02', 'dd.mm.yyyy hh24:mi:ss') ts, 146 val from dual union all
select to_date('10.09.2012 07:12:03', 'dd.mm.yyyy hh24:mi:ss') ts, 146 val from dual union all
select to_date('10.09.2012 08:12:04', 'dd.mm.yyyy hh24:mi:ss') ts, 146 val from dual union all
select to_date('10.09.2012 09:12:11', 'dd.mm.yyyy hh24:mi:ss') ts, 146 val from dual union all
select to_date('10.09.2012 10:12:12', 'dd.mm.yyyy hh24:mi:ss') ts, 146 val from dual union all
select to_date('10.09.2012 11:12:13', 'dd.mm.yyyy hh24:mi:ss') ts, 146 val from dual union all
select to_date('10.09.2012 12:21:24', 'dd.mm.yyyy hh24:mi:ss') ts, 146 val from dual union all
select to_date('10.09.2012 13:21:26', 'dd.mm.yyyy hh24:mi:ss') ts, 146 val from dual union all
select to_date('10.09.2012 14:21:27', 'dd.mm.yyyy hh24:mi:ss') ts, 146 val from dual union all
select to_date('10.09.2012 15:21:30', 'dd.mm.yyyy hh24:mi:ss') ts, 146 val from dual union all
select to_date('10.09.2012 16:21:32', 'dd.mm.yyyy hh24:mi:ss') ts, 146 val from dual union all
select to_date('10.09.2012 21:21:33', 'dd.mm.yyyy hh24:mi:ss') ts, 146 val from dual)
, got_analytics AS
(
    SELECT  ts, val
    ,       MIN (val) OVER ( PARTITION BY  TRUNC (ts)
                             ORDER BY      ts DESC
                           ) AS min_val_after
    ,       CASE
                WHEN ROW_NUMBER () OVER ( PARTITION BY  TRUNC (ts)
                                          ORDER BY      val
                                          ,             ts
                                        ) = 1
                THEN -1 -- Impossibly low val
                ELSE LAG (val) OVER ( PARTITION BY  TRUNC (ts)
                                      ORDER BY      ts
                                    )
            END AS prev_val
    ,       MIN (val) OVER (PARTITION BY TRUNC (ts))
                AS low_val_today
    ,       NVL ( LAST_VALUE (val) OVER ( ORDER BY ts
                                          RANGE BETWEEN UNBOUNDED PRECEDING
                                                AND     ts - TRUNC (ts) PRECEDING
                                        )
                , -1
                ) AS last_val_yesterday
    FROM    sampledata
)
SELECT  ts
,       val
FROM    got_analytics
WHERE   val <= min_val_after
AND     val >  prev_val
AND     ( val >  low_val_today
          OR val != last_val_yesterday
        )
ORDER BY ts
{code}
with the unexpected results:
{code}
- ts val
1 08.09.2012 00:04:08 23
2 08.09.2012 22:12:13 40
3 08.09.2012 23:21:24 68
4 09.09.2012 22:21:33 5
5 09.09.2012 22:51:33 23
6 09.09.2012 23:21:33 40
7 10.09.2012 00:04:08 50
8 10.09.2012 01:03:08 60
9 10.09.2012 02:54:11 78
10 10.09.2012 03:04:08 142
11 10.09.2012 04:04:19 145
12 10.09.2012 05:04:20 146
{code}
The result should be:
{code}
- ts val
1 08.09.2012 00:04:08 23
2 08.09.2012 02:04:08 45
3 08.09.2012 05:03:08 78
4 08.09.2012 06:54:11 90
5 08.09.2012 22:00:12 24
6 08.09.2012 22:12:13 40
7 08.09.2012 23:21:24 68
8 09.09.2012 01:03:08 79
9 09.09.2012 02:54:11 124
10 09.09.2012 03:04:08 125
11 09.09.2012 04:04:19 126
12 09.09.2012 05:04:20 127
13 09.09.2012 22:21:33 5
14 09.09.2012 22:51:33 23
15 09.09.2012 23:21:33 40
16 10.09.2012 00:04:08 50
17 10.09.2012 01:03:08 60
18 10.09.2012 02:54:11 78
19 10.09.2012 03:04:08 142
20 10.09.2012 04:04:19 145
21 10.09.2012 05:04:20 146
{code}
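Note that the expected output above is exactly the set of readings whose value differs from the immediately preceding reading, i.e. the data with consecutive duplicate values collapsed (only the first timestamp of each run is kept). As a sanity check, here is a minimal JavaScript sketch of that rule (the helper name is hypothetical, not part of the query):
{code}
// Keep only readings whose value differs from the previous reading;
// the first reading is always kept.
// `rows` is an array of [timestamp, value] pairs, already sorted by time.
function changedReadings(rows) {
  const out = [];
  let prev; // undefined, so the first row always counts as a change
  for (const [ts, val] of rows) {
    if (val !== prev) {
      out.push([ts, val]);
      prev = val;
    }
  }
  return out;
}
{code}
In the Oracle query this corresponds to keeping the rows where val differs from LAG(val) OVER (ORDER BY ts), treating the first row's missing predecessor as a change.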
Thank you for your help! -
Set top and bottom inset spacing values in Text Frame Options via jsx script
I am looking for a way to set only the top and bottom inset spacing values to 2 points in Text Frame Options via a .jsx script.
For years, I have used a script that sets Preferences, such as:
with(app.storyPreferences){
opticalMarginAlignment = false;
opticalMarginSize = 12; // pts
}
I would like to add the code to this same script that would make Top = 0p2 and Bottom 0p2 but leave Left and Right as 0p0.
Any help would be greatly appreciated.
Here is the full .jsx file that we now use to set preferences.
Ideally, this could be modified to include setting any text frame created to have 0p2 inset Top and Bottom, but 0p0 Left and Right:
//ApplicationTextDefaults
//An InDesign CS2 JavaScript
//Sets the application text defaults, which will become the text defaults for all
//new documents. Existing documents will remain unchanged.
with(app.textDefaults){
alignToBaseline = false; // align to baseline grid
try {
// appliedFont = app.fonts.item("Times New Roman");
appliedFont = app.fonts.item("Helvetica");
}
catch (e) {}
try {
fontStyle = "Medium";
}
catch (e) {}
autoleading = 100;
balanceRaggedLines = false;
baselineShift = 0;
capitalization = Capitalization.normal;
composer = "Adobe Paragraph Composer";
desiredGlyphScaling = 100;
desiredLetterSpacing = 0;
desiredWordSpacing = 100;
dropCapCharacters = 0;
if (dropCapCharacters != 0) {
dropCapLines = 3;
//Assumes that the application has a default character style named "myDropCap"
//dropCapStyle = app.characterStyles.item("myDropCap");
}
fillColor = app.colors.item("Black");
fillTint = 100;
firstLineIndent = "0pt";
// firstLineIndent = "14pt";
gridAlignFirstLineOnly = false;
horizontalScale = 100;
hyphenateAfterFirst = 3;
hyphenateBeforeLast = 4;
hyphenateCapitalizedWords = false;
hyphenateLadderLimit = 1;
hyphenateWordsLongerThan = 5;
hyphenation = true;
hyphenationZone = "3p";
hyphenWeight = 9;
justification = Justification.leftAlign;
keepAllLinesTogether = false;
keepLinesTogether = true;
keepFirstLines = 2;
keepLastLines = 2;
keepWithNext = 0;
kerningMethod = "Optical";
kerningValue = 0;
leading = 6.3;
// leading = 14;
leftIndent = 0;
ligatures = true;
maximumGlyphScaling = 100;
maximumLetterSpacing = 0;
maximumWordSpacing = 160;
minimumGlyphScaling = 100;
minimumLetterSpacing = 0;
minimumWordSpacing = 80;
noBreak = false;
otfContextualAlternate = true;
otfDiscretionaryLigature = true;
otfFigureStyle = OTFFigureStyle.proportionalOldstyle;
otfFraction = true;
otfHistorical = true;
otfOrdinal = false;
otfSlashedZero = true;
otfSwash = false;
otfTitling = false;
overprintFill = false;
overprintStroke = false;
pointSize = 6.3;
// pointSize = 11;
position = Position.normal;
rightIndent = 0;
ruleAbove = false;
if(ruleAbove == true){
ruleAboveColor = app.colors.item("Black");
ruleAboveGapColor = app.swatches.item("None");
ruleAboveGapOverprint = false;
ruleAboveGapTint = 100;
ruleAboveLeftIndent = 0;
ruleAboveLineWeight = .25;
ruleAboveOffset = 14;
ruleAboveOverprint = false;
ruleAboveRightIndent = 0;
ruleAboveTint = 100;
ruleAboveType = app.strokeStyles.item("Solid");
ruleAboveWidth = RuleWidth.columnWidth;
}
ruleBelow = false;
if(ruleBelow == true){
ruleBelowColor = app.colors.item("Black");
ruleBelowGapColor = app.swatches.item("None");
ruleBelowGapOverprint = false;
ruleBelowGapTint = 100;
ruleBelowLeftIndent = 0;
ruleBelowLineWeight = .25;
ruleBelowOffset = 0;
ruleBelowOverprint = false;
ruleBelowRightIndent = 0;
ruleBelowTint = 100;
ruleBelowType = app.strokeStyles.item("Solid");
ruleBelowWidth = RuleWidth.columnWidth;
}
singleWordJustification = SingleWordJustification.leftAlign;
skew = 0;
spaceAfter = 0;
spaceBefore = 0;
startParagraph = StartParagraph.anywhere;
strikeThru = false;
if(strikeThru == true){
strikeThroughColor = app.colors.item("Black");
strikeThroughGapColor = app.swatches.item("None");
strikeThroughGapOverprint = false;
strikeThroughGapTint = 100;
strikeThroughOffset = 3;
strikeThroughOverprint = false;
strikeThroughTint = 100;
strikeThroughType = app.strokeStyles.item("Solid");
strikeThroughWeight = .25;
}
strokeColor = app.swatches.item("None");
strokeTint = 100;
strokeWeight = 0;
tracking = 0;
underline = false;
if(underline == true){
underlineColor = app.colors.item("Black");
underlineGapColor = app.swatches.item("None");
underlineGapOverprint = false;
underlineGapTint = 100;
underlineOffset = 3;
underlineOverprint = false;
underlineTint = 100;
underlineType = app.strokeStyles.item("Solid");
underlineWeight = .25;
}
verticalScale = 100;
}
//Units & Increments preference panel
//Must do this to make sure our units that we set are in points. The vert and horiz
//units that get set default to the current measurement unit. We set it to points
//so we can be sure of the value. We'll reset it later to the desired setting.
with(app.viewPreferences){
horizontalMeasurementUnits = MeasurementUnits.points; // Ruler Units, horizontal
verticalMeasurementUnits = MeasurementUnits.points; // Ruler Units, vertical
}
//General preference panel
with(app.generalPreferences){
pageNumbering = PageNumberingOptions.section; // Page Numbering, View
toolTips = ToolTipOptions.normal; // Tool Tips
// Not supported in CS4
// toolsPalette = ToolsPaletteOptions.doubleColumn; // Floating Tool Palette
completeFontDownloadGlyphLimit = 2000; // Always Subset Fonts...
try {
//Wrapped in try/catch in case it is run with CS4 and earlier to avoid the error
preventSelectingLockedItems = false; // Needed for CS5+
}
catch (e) {}
}
//Type preference panel
with (app.textEditingPreferences){
tripleClickSelectsLine = true; // Triple Click to Select a Line
smartCutAndPaste = true; // Adjust Spacing Automatically when Cutting and Pasting Words
dragAndDropTextInLayout = false; // Enable in Layout View
allowDragAndDropTextInStory = true; // Enable in Story Editor
}
with(app.textPreferences){
typographersQuotes = true; // Use Typographer's Quotes
useOpticalSize = true; // Automatically Use Correct Optical Size
scalingAdjustsText = true; // Adjust Text Attributes when Scaling
useParagraphLeading = false; // Apply Leading to Entire Paragraphs
linkTextFilesWhenImporting = false; // Create Links when Placing Text and Spreadsheet Files
// Missing following (Font Preview Size, Paste All Information/Text Only)
}
//Advanced Type preference panel
with(app.textPreferences){
superscriptSize = 58.3; // Superscript, size
superscriptPosition = 33.3; // Superscript, position
subscriptSize = 58.3; // Subscript, size
subscriptPosition = 33.3; // Subscript, position
smallCap = 70; // Smallcap
}
with(app.imePreferences){
inlineInput = false; // Use Inline Input for Non-Latin Text
}
//Composition preference panel
with(app.textPreferences){
highlightKeeps = false; // Keep Violations
highlightHjViolations = false; // H&J Violations
highlightCustomSpacing = false; // Custom Tracking/Kerning
highlightSubstitutedFonts = true; // Substituted Fonts
highlightSubstitutedGlyphs = false; // Substituted Glyphs
justifyTextWraps = false; // Justify Text Next to an Object
abutTextToTextWrap = true; // Skip by Leading
zOrderTextWrap = false; // Text Wrap Only Affects Text Beneath
}
//Units & Increments preference panel
with(app.viewPreferences){
rulerOrigin = RulerOrigin.spreadOrigin; // Ruler Units, origin
// These are set at the end of the script after all the changes have been made
// horizontalMeasurementUnits = MeasurementUnits.points; // Ruler Units, horizontal
// verticalMeasurementUnits = MeasurementUnits.inches; // Ruler Units, vertical
pointsPerInch = 72; // Point/Pica Size, Points/Inch
cursorKeyIncrement = 1; // Keyboard Increment, Cursor Key
}
with(app.textPreferences){
baselineShiftKeyIncrement = 2; // Keyboard Increment, Baseline Shift
leadingKeyIncrement = 2; // Keyboard Increment, Size/Leading
kerningKeyIncrement = 20; // Keyboard Increment, Kerning
}
//Grids preference panel
with(app.gridPreferences){
baselineColor = UIColors.lightBlue; // Baseline Grid, Color
baselineStart = 48; // Baseline Grid, Start
baselineDivision = 6; // Baseline Grid, Increment Every
baselineViewThreshold = 50; // Baseline Grid, View Threshold
baselineGridRelativeOption = BaselineGridRelativeOption.topOfPageOfBaselineGridRelativeOption; // Baseline Grid, Relative To
gridColor = UIColors.lightGray; // Document Grid, Color
horizontalGridlineDivision = 12; // Document Grid, Horizontal, Gridline Every
horizontalGridSubdivision = 12; // Document Grid, Horizontal, Subdivisions
verticalGridlineDivision = 12; // Document Grid, Vertical, Gridline Every
verticalGridSubdivision = 12; // Document Grid, Vertical, Subdivisions
gridsInBack = true; // Grids in Back
documentGridSnapto = false; // snap to grid or not
documentGridShown = false; // show document grid
}
//Guides & Pasteboard preference panel
with(app.documentPreferences){
marginGuideColor = UIColors.violet; // Color, Margins
columnGuideColor = UIColors.magenta; // Color, Columns
}
with(app.pasteboardPreferences){
bleedGuideColor = UIColors.fiesta; // Color, Bleed
slugGuideColor = UIColors.gridBlue; // Color, Slug
previewBackgroundColor = UIColors.lightGray; // Color, Preview Background
minimumSpaceAboveAndBelow = 72; // Minimum Vertical Offset
}
with(app.viewPreferences){
guideSnaptoZone = 4; // Snap to Zone
}
with(app.guidePreferences){
guidesInBack = false; // Guides in Back
}
//Dictionary preference panel
with(app.dictionaryPreferences){
composition = ComposeUsing.both; // Hyphenation Exceptions, Compose Using
mergeUserDictionary = false; // Merge User Dictionary into Document
recomposeWhenChanged = true; // Recompose All Stories When Modified
// Missing (Lang, Hyph, Spelling, Double Quotes, Single Quotes)
}
//Spelling preference panel
with(app.spellPreferences){
checkMisspelledWords = true; // Find, Misspelled Words
checkRepeatedWords = true; // Find, Repeated Words
checkCapitalizedWords = true; // Find, Uncapitalized Words
checkCapitalizedSentences = true; // Find, Uncapitalized Sentences
dynamicSpellCheck = true; // Enable Dynamic Spelling
misspelledWordColor = UIColors.red; // Color, Misspelled Words
repeatedWordColor = UIColors.green; // Color, Repeated Words
uncapitalizedWordColor = UIColors.green; // Color, Uncapitalized Words
uncapitalizedSentenceColor = UIColors.green; // Color, Uncapitalized Sentences
}
//Autocorrect preference panel
with(app.autoCorrectPreferences){
autoCorrect = true; // Enable Autocorrect
autoCorrectCapitalizationErrors = false; // Autocorrect Capitalization
// Missing (Language, Misspelled word pairs)
}
//Display Performance preference panel
with(app.displayPerformancePreferences){
defaultDisplaySettings = ViewDisplaySettings.typical; // Preserve Object-Level
persistLocalSettings = false;
// Missing (anti-aliasing, greek below)
}
//Story Editor Display preference panel
with(app.galleyPreferences){
textColor = InCopyUIColors.black; // Text Color
backgroundColor = InCopyUIColors.white; // Background
smoothText = true; // Enable Anti-Aliasing
antiAliasType = AntiAliasType.grayAntialiasing; // Type
cursorType = CursorTypes.standardCursor; // Cursor Type
blinkCursor = true; // Blink
// Missing (Font, Size, Line Spacing & Theme)
}
//File Handling preference panel
with(app.generalPreferences){
includePreview = true; // Always Save Preview Images with Doc
previewSize = PreviewSizeOptions.medium; // Preview Size
}
with(app.clipboardPreferences){
preferPDFWhenPasting = false; // Prefer PDF When Pasting
copyPDFToClipboard = true; // Copy PDF to Clipboard
preservePdfClipboardAtQuit = false; // Preserve PDF Data at Quit
// Missing (Enable Version Cue)
}
// Optical margin (hanging punctuation, outside margins)
with(app.storyPreferences){
opticalMarginAlignment = false;
opticalMarginSize = 12; // pts
}
//Wrap Up (do at end of script)
//Units & Increments preference panel
//Must do this to make sure our units that we set are in points. The vert and horiz
//units that get set default to the current measurement unit. We set it to points
//so we can be sure of the value. We'll reset it later to the desired setting.
with(app.viewPreferences){
horizontalMeasurementUnits = MeasurementUnits.picas; // Ruler Units, horizontal
verticalMeasurementUnits = MeasurementUnits.inches; // Ruler Units, vertical
}
// These two flags are turned off to avoid the error message about
// missing image links when InDesign opens an ad. This can especially
// be a problem when doing batch processes.
with(app.linkingPreferences){
checkLinksAtOpen = false; // checkbox: true/false
findMissingLinksAtOpen = false; // checkbox: true/false
} -
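For the original question in this thread (0p2 top and bottom insets, 0p0 left and right), a hedged sketch that could be appended to a defaults script like the one above: it assumes InDesign's insetSpacing property on textFramePreferences, which takes an array in the order [top, left, bottom, right], with values interpreted in the current measurement units (the script sets points before making changes).
{code}
// Hypothetical helper: build the inset array (assumed order
// [top, left, bottom, right]) so only top and bottom get the value.
function insetTopBottom(points) {
  return [points, 0, points, 0]; // left and right stay at 0p0
}

// Inside InDesign (ExtendScript) this would be applied to the
// application default, so every newly created text frame inherits it:
// app.textFramePreferences.insetSpacing = insetTopBottom(2);
{code}
This is a sketch under the stated assumptions, not a verified drop-in; check the insetSpacing order against the InDesign scripting DOM for your version before relying on it.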
How to obtain all values in the field range of a domain?
Hello,
You can find those values in table DD07L.
Regards
Naimesh -
How can I pass a value over to a TitleWindow
From the documentation given by Adobe itself, I have only seen examples of how to pass a value to the main application from the title window, but not from the main application to the title window.
What I am trying to do is get the selected item from a tile list and pass it to an SWFLoader in a TitleWindow component, so that it knows which SWF (in my case) to search for within the specific destination folder.
What I have tried is declaring a variable to store the selectedItem. An example would be:
main_application
public var selectedItem:String = "{tileList.selectedItem.Image}";
titlewindow
source="assets/data_games/{Application.application.selectedItem}.swf"
but it seems to be the wrong way to call it. Could anyone help me out with solving these problems?
First, this forum is intended for questions about Flex Builder. Flex language questions should be posted in the Flex General Discussion forum.
That is generally right, but I would build the SWF source string in a variable first so you can debug it. Do that in a handler function on the creationComplete event of the TitleWindow.
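That suggestion can be sketched as a tiny helper (the function name is hypothetical) that builds the path string so it can be traced before being handed to the SWFLoader:
{code}
// Build the SWF path from the tile list's selected item, so the value
// can be inspected/debugged before assigning it to the loader's source.
function swfSource(selectedImage) {
  return "assets/data_games/" + selectedImage + ".swf";
}
{code}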
Tracy -
The values will be read from a digital measurement device, so there shouldn't be much of a problem in recognizing the fonts, I hope.
Ideally, it'd function like this: the webcam is trained on the device and, without interaction from the user, LabVIEW processes the current data every 30 seconds, converts it to a string, and averages the 30-second values over a period of 5 minutes. The average is then given to a VI that would normally poll the user for a value every 5 minutes. I'm trying to close the loop on a control system, and think this is the best way to do it.
If I understand this correctly, you would like to take a picture of a screen every 30 seconds and read some text on that screen. You can do that using the NI Vision Development Module's Optical Character Recognition function palette. This will allow you to create a set of images of each character that you expect to read. You can then train the software to associate an image of each character with the actual digital character. In this way, the software will be able to read the text. You can read more about this concept in the NI Vision Concepts Help 2013 manual.
Is there any other way to transfer the information from the first computer to the second computer other than using a camera to read the text off a screen? That seems like a strange way to transfer data, but it's possible that you don't have access to the code on the first computer and you can only access it from the front panel. Is that the situation that you are in?
Take care,
Jeremy P.
Applications Engineer
National Instruments -
I would like to know how I can create a bell graph without using subVIs. The data I created consists of 500 readings with values from 0 to 100; I have calculated the mean value and standard deviation. I hope someone can help me.
Here's a quick example I threw together that generates a sort-of-bell-curve shaped data distribution, then performs the binning and plotting.
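The binning step described above can be sketched like this (a hedged JavaScript illustration of the histogram logic, not the attached VI; the function name is hypothetical):
{code}
// Count readings in [lo, hi] into equal-width bins; the resulting
// counts are what gets plotted as the bell-shaped graph.
function binReadings(values, binCount, lo, hi) {
  const width = (hi - lo) / binCount;
  const counts = new Array(binCount).fill(0);
  for (const v of values) {
    let i = Math.floor((v - lo) / width);
    if (i >= binCount) i = binCount - 1; // put v === hi in the last bin
    if (i >= 0) counts[i] += 1;         // ignore values below lo
  }
  return counts;
}
{code}
For 500 readings between 0 and 100, binReadings(readings, 10, 0, 100) yields ten bin counts ready for a bar plot.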
-Kevin P.
Message Edited by Kevin Price on 12-01-2006 02:42 PM
Attachments:
Binning example.vi 51 KB
Binning example.png 12 KB -
Assigning value to a range dynamically
Hello All,
I want to assign the values to a range dynamically.
For example in the following piece of code, I cannot directly assign wa-kunnr to
r_kunnr-low.
LOOP AT ITAB INTO WA.
r_kunnr-sign = 'I'.
r_kunnr-option = 'EQ'.
r_kunnr-low = wa-kunnr.
APPEND r_kunnr.
ENDLOOP.
Can this be done dynamically? If I use the following code, I am getting a short dump.
lv_fnam = 'wa-kunnr'.
LOOP AT ITAB INTO WA.
r_kunnr-sign = 'I'.
r_kunnr-option = 'EQ'.
r_kunnr-low = ( lv_fnam ).
APPEND r_kunnr.
ENDLOOP.
Could anyone please suggest how to do this?
Regards
Indrajit
Hi,
Try giving it capital letters:
lv_fnam = 'WA-KUNNR'.
LOOP AT ITAB INTO WA.
r_kunnr-sign = 'I'.
r_kunnr-option = 'EQ'.
r_kunnr-low = ( lv_fnam ).
APPEND r_kunnr.
ENDLOOP.
Thanks
Naren -
In a purchase order for 3 GR's Quantity does not equal the value
Hi,
I have found a difference in one purchase order: there have been 3 GRs where the quantity does not equal the value. How has this happened?
Any guesses why the difference has come?
Thanks & Regards,
Veena
Hi Vishal,
In the PO history for agt, the quantity of the 3 GRs does not equal the values. I suspect the difference lies in the movement types, but I am not sure exactly where to check these movements.
Can you tell me which T-code to use?
Tx, -
Octroi is not calculating on Freight value
Hello,
We have maintained all the conditions in the PO conditions tab, including the freight conditions.
Here, octroi is not being calculated on the freight value.
If the freight vendor is different, then it is OK to calculate octroi without freight.
If the freight vendor is the same as the main vendor, then octroi should be calculated including the freight value.
Regards
Mahesh Naik
Hello,
In the pricing schema, we have given 361 as the ALT CBV for the octroi condition (JOCM), to calculate the octroi percentage on the octroi base value, that is, basic price + excise duty + sales tax.
With 361, the system should fetch the octroi base value from the taxation procedure into the pricing schema.
Regards
Mahesh Naik.