Date Being Returned Is 1-Jan-4000
Hi.
I am using Oracle 10gR2 on a Solaris server. I've set up Oracle's Change Data Capture (CDC) and am noticing that the date column in my change table is set to 1-Jan-4000 when, in fact, the actual date from my source table is 5-Mar-2007. Can someone please tell me why this is happening?
Thank you.
Elie
From the java.io.File API:
A long value representing the time the file was last modified, measured in milliseconds since the epoch (00:00:00 GMT, January 1, 1970), or 0L if the file does not exist or if an I/O error occurs.
For whatever reason, the fact that you are getting time 0 (midnight 1/1/1970 GMT, which with your timezone adjustment may display as sometime on 12/31/1969) means that no file exists for your specified pathname, or some I/O error prevented access to the file.
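A minimal sketch of the two cases the javadoc describes (the filename here is hypothetical):

```java
import java.io.File;
import java.io.IOException;

public class LastModifiedCheck {
    public static void main(String[] args) throws IOException {
        // Case 1: the file does not exist, so lastModified() returns 0L
        // rather than throwing; check exists() before trusting the value
        File missing = new File("no-such-file-12345.txt");
        System.out.println(missing.exists());        // false
        System.out.println(missing.lastModified());  // 0

        // Case 2: a real file yields genuine epoch milliseconds
        File real = File.createTempFile("lm-demo", ".txt");
        real.deleteOnExit();
        System.out.println(real.lastModified() > 0); // true
    }
}
```

A 0L result rendered through a date formatter in a negative-offset timezone is exactly what produces the 12/31/1969 display the reply describes.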
Similar Messages
-
Data service returning an empty object
Hi
I am trying to call a data service from Flex to load my data grid. In the debugger, I saw that the ArrayCollection in the fill method is being returned empty. Here are the details....
My .java file has the same names with the set and get functions which are set by my Java assembler class.
I am calling other fill methods and they seem to be working fine except this one....
The problem was that the name of the destination "codecoverage" was the same as that of one of the packages where my .java file was. Changed this name and things started to work..:-) -
Explain plan cardinality is way off compared to actual rows being returned
Database version 11.2.0.3
We have a small but rapidly growing data warehouse which has OBIEE as its front-end reporting tool. Our DBA has set up an automatic stats gathering method in OEM, and we can see that it runs and gathers stats on stale objects on a regular basis. So we know the statistics are up to date.
In checking some slow queries I can see that the cardinality being reported in explain plans is way off compared to what is actually being returned.
For example the actual number of rows returned are 8000 but the cardinality estimate is > 300,000.
Now, as per an Oracle white paper (The Oracle Optimizer Explain the Explain Plan), having "multiple single column predicates on a single table" can affect cardinality estimates, and in the case of our query that is true. Here is the WHERE clause section of the query:
SQL> select D1.c1 as c1,
2 D1.c2 as c2,
3 D1.c3 as c3,
4 D1.c4 as c4,
5 D1.c5 as c5,
6 D1.c6 as c6,
7 D1.c7 as c7,
8 D1.c8 as c8,
9 D1.c9 as c9,
10 D1.c10 as c10,
11 D1.c11 as c11,
12 D1.c12 as c12,
13 D1.c13 as c13,
14 D1.c14 as c14,
15 D1.c15 as c15,
16 D1.c16 as c16
17 from (select D1.c4 as c1,
18 D1.c5 as c2,
19 D1.c3 as c3,
20 D1.c1 as c4,
21 D1.c6 as c5,
22 D1.c7 as c6,
23 D1.c2 as c7,
24 D1.c8 as c8,
25 D1.c9 as c9,
26 D1.c10 as c10,
27 D1.c9 as c11,
28 D1.c11 as c12,
29 D1.c2 as c13,
30 D1.c2 as c14,
31 D1.c12 as c15,
32 'XYZ' as c16,
33 ROW_NUMBER() OVER(PARTITION BY D1.c2, D1.c3, D1.c4, D1.c5, D1.c6, D1.c7, D1.c8, D1.c9, D1.c10, D1.c11, D1.c12 ORDER BY D1.c2 ASC, D1.c3 ASC, D1.c4 ASC, D1.c5 ASC, D1.c6 ASC, D1.c7 ASC, D1.c8 ASC, D1.c9 ASC, D1.c10 ASC, D1.c11 ASC, D1.c12 ASC) as c17
34 from (select distinct D1.c1 as c1,
35 D1.c2 as c2,
36 'CHANNEL1' as c3,
37 D1.c3 as c4,
38 D1.c4 as c5,
39 D1.c5 as c6,
40 D1.c6 as c7,
41 D1.c7 as c8,
42 D1.c8 as c9,
43 D1.c9 as c10,
44 D1.c10 as c11,
45 D1.c11 as c12
46 from (select sum(T610543.GLOBAL1_EXCHANGE_RATE * case
47 when T610543.X_ZEB_SYNC_EBS_FLG = 'Y' then
48 T610543.X_ZEB_AIA_U_REVN_AMT
49 else
50 0
51 end) as c1,
52 T536086.X_ZEBRA_TERRITORY as c2,
53 T526821.LEVEL9_NAME as c3,
54 T526821.LEVEL1_NAME as c4,
55 T577698.PER_NAME_FSCL_YEAR as c5,
56 T577698.FSCL_QTR as c6,
57 T31796.X_ZEBRA_TERRITORY as c7,
58 T31796.X_OU_NUM as c8,
59 T664055.TERRITORY as c9,
60 T536086.X_OU_NUM as c10,
61 T526821.LEVEL4_NAME as c11
62 from W_INT_ORG_D T613144 /* Dim_ZEB_W_INT_ORG_D_POS_Client_Attr_Direct */,
63 W_ZEBRA_REGION_D T664055 /* Dim_ZEB_W_ZEBRA_REGION_D_POS_Client_Direct */,
64 W_DAY_D T577698 /* Dim_ZEB_W_DAY_D_Order_Invoice_Date */,
65 WC_PRODUCT_HIER_DH T526821 /* Dim_WC_PRODUCT_HIER_DH */,
66 W_PRODUCT_D T32069 /* Dim_W_PRODUCT_D */,
67 W_ORG_D T31796,
68 W_ORG_D T536086 /* Dim_ZEB_W_ORG_D_Reseller */,
69 W_ORDERITEM_TMP_F T610543 /* Fact_ZEB_W_ORDERITEM_F_Direct */
70 where (T610543.PR_OWNER_BU_WID = T613144.ROW_WID and
71 T577698.ROW_WID =
72 T610543.X_ZEB_AIA_TRXN_DT_WID and
73 T32069.ROW_WID = T526821.PROD_WID and
74 T32069.ROW_WID = T610543.ROOT_LN_PROD_WID and
75 T536086.ROW_WID = T610543.ACCNT_WID and
76 T31796.DATASOURCE_NUM_ID =
77 T610543.DATASOURCE_NUM_ID and
78 T31796.INTEGRATION_ID = T610543.VIS_PR_BU_ID and
79 T536086.DELETE_FLG = 'N' and
80 T610543.X_ZEB_DELETE_FLG = 'N' and
81 T613144.X_ZEB_REGION_WID = T664055.ROW_WID and
82 T577698.FSCL_DAY_OF_YEAR < 97 and
83 '2006' < T577698.PER_NAME_FSCL_YEAR and
84 T536086.X_OU_NUM <> '11073' and
85 T536086.X_ZEBRA_TERRITORY !=
86 'XX23' and
87 T536086.X_OU_NUM != '56791647728774' and
88 T536086.X_OU_NUM != '245395890' and
89 T536086.X_ZEBRA_TERRITORY !=
90 'STRATEGIC ACCTS 2' and
91 T526821.LEVEL2_NAME != 'Charges' and
92 T526821.LEVEL9_NAME != 'Unspecified' and
93 T536086.X_ZEBRA_TERRITORY !=
94 'XX1' and T536086.X_ZEBRA_TERRITORY !=
95 'XX2' and T536086.X_ZEBRA_TERRITORY !=
96 'XX3' and T536086.X_ZEBRA_TERRITORY !=
97 'XX4' and
98 (T536086.X_ZEBRA_TERRITORY in
99 ( ... In List of 22 values )) and
125 T32069.X_ZEB_EBS_PRODUCT_TYPE is null)
126 group by T31796.X_ZEBRA_TERRITORY,
127 T31796.X_OU_NUM,
128 T526821.LEVEL1_NAME,
129 T526821.LEVEL4_NAME,
130 T526821.LEVEL9_NAME,
131 T536086.X_OU_NUM,
132 T536086.X_ZEBRA_TERRITORY,
133 T577698.FSCL_QTR,
134 T577698.PER_NAME_FSCL_YEAR,
135 T664055.TERRITORY) D1) D1) D1
136 where (D1.c17 = 1)
137 /
Elapsed: 00:00:35.19
Execution Plan
Plan hash value: 3285002974
| Id | Operation | Name | Rows | Bytes |TempSpc| Cost (%CPU)| Time | Pstart| Pstop | TQ |IN-OUT| PQ Distrib |
| 0 | SELECT STATEMENT | | 2145M| 2123G| | 612K (1)| 03:03:47 | | | | | |
| 1 | PX COORDINATOR | | | | | | | | | | | |
| 2 | PX SEND QC (RANDOM) | :TQ10012 | 2145M| 2123G| | 612K (1)| 03:03:47 | | | Q1,12 | P->S | QC (RAND) |
|* 3 | VIEW | | 2145M| 2123G| | 612K (1)| 03:03:47 | | | Q1,12 | PCWP | |
|* 4 | WINDOW NOSORT | | 2145M| 421G| | 612K (1)| 03:03:47 | | | Q1,12 | PCWP | |
| 5 | SORT GROUP BY | | 2145M| 421G| 448G| 612K (1)| 03:03:47 | | | Q1,12 | PCWP | |
| 6 | PX RECEIVE | | 2145M| 421G| | 1740 (11)| 00:00:32 | | | Q1,12 | PCWP | |
| 7 | PX SEND HASH | :TQ10011 | 2145M| 421G| | 1740 (11)| 00:00:32 | | | Q1,11 | P->P | HASH |
|* 8 | HASH JOIN BUFFERED | | 2145M| 421G| | 1740 (11)| 00:00:32 | | | Q1,11 | PCWP | |
| 9 | PX RECEIVE | | 268K| 7864K| | 93 (2)| 00:00:02 | | | Q1,11 | PCWP | |
| 10 | PX SEND HASH | :TQ10009 | 268K| 7864K| | 93 (2)| 00:00:02 | | | Q1,09 | P->P | HASH |
| 11 | PX BLOCK ITERATOR | | 268K| 7864K| | 93 (2)| 00:00:02 | | | Q1,09 | PCWC | |
| 12 | TABLE ACCESS FULL | W_ORG_D | 268K| 7864K| | 93 (2)| 00:00:02 | | | Q1,09 | PCWP | |
| 13 | PX RECEIVE | | 345K| 59M| | 1491 (2)| 00:00:27 | | | Q1,11 | PCWP | |
| 14 | PX SEND HASH | :TQ10010 | 345K| 59M| | 1491 (2)| 00:00:27 | | | Q1,10 | P->P | HASH |
|* 15 | HASH JOIN BUFFERED | | 345K| 59M| | 1491 (2)| 00:00:27 | | | Q1,10 | PCWP | |
| 16 | PX RECEIVE | | 1321 | 30383 | | 2 (0)| 00:00:01 | | | Q1,10 | PCWP | |
| 17 | PX SEND BROADCAST | :TQ10006 | 1321 | 30383 | | 2 (0)| 00:00:01 | | | Q1,06 | P->P | BROADCAST |
| 18 | PX BLOCK ITERATOR | | 1321 | 30383 | | 2 (0)| 00:00:01 | | | Q1,06 | PCWC | |
| 19 | TABLE ACCESS FULL | W_ZEBRA_REGION_D | 1321 | 30383 | | 2 (0)| 00:00:01 | | | Q1,06 | PCWP | |
|* 20 | HASH JOIN | | 345K| 52M| | 1488 (2)| 00:00:27 | | | Q1,10 | PCWP | |
| 21 | JOIN FILTER CREATE | :BF0000 | 9740 | 114K| | 2 (0)| 00:00:01 | | | Q1,10 | PCWP | |
| 22 | PX RECEIVE | | 9740 | 114K| | 2 (0)| 00:00:01 | | | Q1,10 | PCWP | |
| 23 | PX SEND HASH | :TQ10007 | 9740 | 114K| | 2 (0)| 00:00:01 | | | Q1,07 | P->P | HASH |
| 24 | PX BLOCK ITERATOR | | 9740 | 114K| | 2 (0)| 00:00:01 | | | Q1,07 | PCWC | |
| 25 | TABLE ACCESS FULL | W_INT_ORG_D | 9740 | 114K| | 2 (0)| 00:00:01 | | | Q1,07 | PCWP | |
| 26 | PX RECEIVE | | 344K| 47M| | 1486 (2)| 00:00:27 | | | Q1,10 | PCWP | |
| 27 | PX SEND HASH | :TQ10008 | 344K| 47M| | 1486 (2)| 00:00:27 | | | Q1,08 | P->P | HASH |
| 28 | JOIN FILTER USE | :BF0000 | 344K| 47M| | 1486 (2)| 00:00:27 | | | Q1,08 | PCWP | |
|* 29 | HASH JOIN BUFFERED | | 344K| 47M| | 1486 (2)| 00:00:27 | | | Q1,08 | PCWP | |
| 30 | JOIN FILTER CREATE | :BF0001 | 35290 | 964K| | 93 (2)| 00:00:02 | | | Q1,08 | PCWP | |
| 31 | PX RECEIVE | | 35290 | 964K| | 93 (2)| 00:00:02 | | | Q1,08 | PCWP | |
| 32 | PX SEND HASH | :TQ10004 | 35290 | 964K| | 93 (2)| 00:00:02 | | | Q1,04 | P->P | HASH |
| 33 | PX BLOCK ITERATOR | | 35290 | 964K| | 93 (2)| 00:00:02 | | | Q1,04 | PCWC | |
|* 34 | TABLE ACCESS FULL | W_ORG_D | 35290 | 964K| | 93 (2)| 00:00:02 | | | Q1,04 | PCWP | |
| 35 | PX RECEIVE | | 344K| 38M| | 1392 (2)| 00:00:26 | | | Q1,08 | PCWP | |
| 36 | PX SEND HASH | :TQ10005 | 344K| 38M| | 1392 (2)| 00:00:26 | | | Q1,05 | P->P | HASH |
| 37 | JOIN FILTER USE | :BF0001 | 344K| 38M| | 1392 (2)| 00:00:26 | | | Q1,05 | PCWP | |
|* 38 | HASH JOIN BUFFERED | | 344K| 38M| | 1392 (2)| 00:00:26 | | | Q1,05 | PCWP | |
| 39 | PX RECEIVE | | 93791 | 4671K| | 7 (0)| 00:00:01 | | | Q1,05 | PCWP | |
| 40 | PX SEND HASH | :TQ10001 | 93791 | 4671K| | 7 (0)| 00:00:01 | | | Q1,01 | P->P | HASH |
| 41 | PX BLOCK ITERATOR | | 93791 | 4671K| | 7 (0)| 00:00:01 | | | Q1,01 | PCWC | |
|* 42 | TABLE ACCESS FULL | WC_PRODUCT_HIER_DH | 93791 | 4671K| | 7 (0)| 00:00:01 | | | Q1,01 | PCWP | |
|* 43 | HASH JOIN | | 894K| 57M| | 1384 (2)| 00:00:25 | | | Q1,05 | PCWP | |
| 44 | JOIN FILTER CREATE | :BF0002 | 243K| 1904K| | 48 (3)| 00:00:01 | | | Q1,05 | PCWP | |
| 45 | PX RECEIVE | | 243K| 1904K| | 48 (3)| 00:00:01 | | | Q1,05 | PCWP | |
| 46 | PX SEND HASH | :TQ10002 | 243K| 1904K| | 48 (3)| 00:00:01 | | | Q1,02 | P->P | HASH |
| 47 | PX BLOCK ITERATOR | | 243K| 1904K| | 48 (3)| 00:00:01 | | | Q1,02 | PCWC | |
|* 48 | TABLE ACCESS FULL | W_PRODUCT_D | 243K| 1904K| | 48 (3)| 00:00:01 | | | Q1,02 | PCWP | |
| 49 | PX RECEIVE | | 894K| 50M| | 1336 (2)| 00:00:25 | | | Q1,05 | PCWP | |
| 50 | PX SEND HASH | :TQ10003 | 894K| 50M| | 1336 (2)| 00:00:25 | | | Q1,03 | P->P | HASH |
| 51 | JOIN FILTER USE | :BF0002 | 894K| 50M| | 1336 (2)| 00:00:25 | | | Q1,03 | PCWP | |
|* 52 | HASH JOIN | | 894K| 50M| | 1336 (2)| 00:00:25 | | | Q1,03 | PCWP | |
| 53 | PX RECEIVE | | 292 | 3504 | | 136 (0)| 00:00:03 | | | Q1,03 | PCWP | |
| 54 | PX SEND BROADCAST LOCAL| :TQ10000 | 292 | 3504 | | 136 (0)| 00:00:03 | | | Q1,00 | P->P | BCST LOCAL |
| 55 | PX BLOCK ITERATOR | | 292 | 3504 | | 136 (0)| 00:00:03 | | | Q1,00 | PCWC | |
|* 56 | TABLE ACCESS FULL | W_DAY_D | 292 | 3504 | | 136 (0)| 00:00:03 | | | Q1,00 | PCWP | |
| 57 | PX BLOCK ITERATOR | | 4801K| 215M| | 1199 (2)| 00:00:22 | 1 | 11 | Q1,03 | PCWC | |
|* 58 | TABLE ACCESS FULL | W_ORDERITEM_TMP_F | 4801K| 215M| | 1199 (2)| 00:00:22 | 1 | 44 | Q1,03 | PCWP | |
Note
- dynamic sampling used for this statement (level=5)
Statistics
498 recursive calls
2046 db block gets
1193630 consistent gets
74398 physical reads
0 redo size
655170 bytes sent via SQL*Net to client
11761 bytes received via SQL*Net from client
541 SQL*Net roundtrips to/from client
64 sorts (memory)
0 sorts (disk)
8090 rows processed
SQL>
So my question is: if the cardinality estimates are way off, is that an indicator that the explain plans being generated are sub-optimal?
Can you provide me with some tips, or links to blog posts or books, on how to approach tuning such queries where the cardinality estimates are not good?
Edited by: qqq on Apr 7, 2013 2:27 PM
As already asked in your other thread:
Please see the FAQ for how to post a tuning request and the information that you need to provide.
Part of that information is:
1. DDL for the table and indexes
2. The query being used
3. row counts for the table and for the predicates used in the query
4. info about stats. You did update the table and index stats, didn't you?
5. The 'actual' execution plans.
An explain plan just shows what Oracle 'thinks' it is going to do. The actual plan shows what Oracle actually 'did' do. Just because Oracle expected something doesn't mean it actually happened.
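A common way to capture that 'actual' plan is to run the statement once with rowsource statistics enabled and then pull the cursor's plan with estimated and actual rows side by side. A sketch, assuming 10g or later; the column pair in the extended-statistics example is taken from the posted predicates but the choice of column group is illustrative:

```sql
-- Run the query under test once with rowsource statistics enabled
SELECT /*+ GATHER_PLAN_STATISTICS */ ... ;

-- Then compare estimated (E-Rows) against actual (A-Rows) per plan step
SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY_CURSOR(NULL, NULL, 'ALLSTATS LAST'));

-- If correlated single-column predicates are the cause, a column-group
-- extended statistic (available in 11g) can improve the estimate
SELECT DBMS_STATS.CREATE_EXTENDED_STATS(
         ownname   => USER,
         tabname   => 'W_ORG_D',
         extension => '(X_OU_NUM, X_ZEBRA_TERRITORY)')
FROM   DUAL;

EXEC DBMS_STATS.GATHER_TABLE_STATS(USER, 'W_ORG_D')
```

Steps whose A-Rows diverge from E-Rows by orders of magnitude are the ones to investigate first.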
When you post the plans, use code tags on the line before and on the line after to preserve formatting.
Your partial code is virtually unusable because of the missing conditions in the predicates. You need to use '!=' for 'not equals' if that's what those missing conditions are.
Please edit your post to use code tags, add the missing conditions and provide the other information needed for a tuning request. -
Can the WebLogic client handle empty arrays being returned?
Hi All:
I am using weblogic 9.2 to generate the web service client code.
Here is my build.xml file to build the java client code:
>
<project name="webservices-TEST" default="client">
<taskdef name="clientgen" classname="weblogic.wsee.tools.anttasks.ClientGenTask" />
<target name="client" >
<clientgen
wsdl=" http://test:7035/TEST-engine/services/accumulator.wsdl"
destDir="clientclasses"
packageName="com.TEST.TEST2.caps.wsclient"
>
</clientgen>
</target>
</project>
Here is my client code to test the web service.
>
public class testWebServicesClient {
    public static void main(String[] test) {
        try {
            Long recipientId = 11597997730620L;
            AccumulatorWS impl = new AccumulatorWS_Impl("http://test:7035/TEST-engine/services/accumulator.wsdl");
            IAccumulatorWS iAccWS1 = impl.getIAccumulatorWS();
            AccumulatorResponse response = iAccWS1.getCurrentAccumulators(recipientId, "", "", "", "en", "");
            System.err.println("The status is: " + response.getStatus());
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
I get the following error:
java.rmi.RemoteException: Illegal Capacity: -1; nested exception is:
java.lang.IllegalArgumentException: Illegal Capacity: -1
at com.test.wsclient.IAccumulatorWS_Stub.getCurrentAccumulators(IAccumulatorWS_Stub.java:46)
at test.testWebServicesClient.main(testWebServicesClient.java:27)
Caused by: java.lang.IllegalArgumentException: Illegal Capacity: -1
at java.util.ArrayList.<init>(ArrayList.java:111)
at com.bea.staxb.runtime.internal.util.collections.ArrayListBasedObjectAccumulator.createNewStore(ArrayListBasedObjectAccumulator.java:42)
at com.bea.staxb.runtime.internal.util.collections.ObjectAccumulator.<init>(ObjectAccumulator.java:39)
at com.bea.staxb.runtime.internal.util.collections.ArrayListBasedObjectAccumulator.<init>(ArrayListBasedObjectAccumulator.java:31)
at com.bea.staxb.runtime.internal.util.collections.AccumulatorFactory.createAccumulator(AccumulatorFactory.java:37)
at com.bea.staxb.runtime.internal.util.collections.AccumulatorFactory.createAccumulator(AccumulatorFactory.java:74)
at com.bea.staxb.runtime.internal.SoapArrayRuntimeBindingType.createIntermediary(SoapArrayRuntimeBindingType.java:255)
at com.bea.staxb.runtime.internal.RuntimeBindingProperty.createIntermediary(RuntimeBindingProperty.java:117)
at com.bea.staxb.runtime.internal.SoapUnmarshalResult.unmarshalElementProperty(SoapUnmarshalResult.java:364)
at com.bea.staxb.runtime.internal.SoapUnmarshalResult.basicExtractAndFill(SoapUnmarshalResult.java:241)
at com.bea.staxb.runtime.internal.SoapUnmarshalResult.extractAndFillElementProp(SoapUnmarshalResult.java:174)
at com.bea.staxb.runtime.internal.ByNameUnmarshaller.deserializeContents(ByNameUnmarshaller.java:51)
at com.bea.staxb.runtime.internal.AttributeUnmarshaller.unmarshalIntoIntermediary(AttributeUnmarshaller.java:47)
at com.bea.staxb.runtime.internal.SoapUnmarshalResult.umarshalComplexElementWithId(SoapUnmarshalResult.java:395)
at com.bea.staxb.runtime.internal.SoapUnmarshalResult.unmarshalElementProperty(SoapUnmarshalResult.java:366)
at com.bea.staxb.runtime.internal.SoapUnmarshalResult.basicExtractAndFill(SoapUnmarshalResult.java:241)
at com.bea.staxb.runtime.internal.SoapUnmarshalResult.extractAndFillElementProp(SoapUnmarshalResult.java:174)
at com.bea.staxb.runtime.internal.ByNameUnmarshaller.deserializeContents(ByNameUnmarshaller.java:51)
at com.bea.staxb.runtime.internal.AttributeUnmarshaller.unmarshal(AttributeUnmarshaller.java:38)
at com.bea.staxb.runtime.internal.SoapUnmarshalResult.unmarshalBindingType(SoapUnmarshalResult.java:110)
at com.bea.staxb.runtime.internal.UnmarshalResult.unmarshalType(UnmarshalResult.java:212)
at com.bea.staxb.runtime.internal.SoapUnmarshallerImpl.unmarshalType(SoapUnmarshallerImpl.java:93)
at weblogic.wsee.bind.runtime.internal.EncodedDeserializerContext.unmarshalType(EncodedDeserializerContext.java:66)
at weblogic.wsee.bind.runtime.internal.BaseDeserializerContext.internalDeserializeType(BaseDeserializerContext.java:170)
at weblogic.wsee.bind.runtime.internal.BaseDeserializerContext.deserializeType(BaseDeserializerContext.java:87)
at weblogic.wsee.codec.soap11.SoapDecoder.decodePart(SoapDecoder.java:401)
at weblogic.wsee.codec.soap11.SoapDecoder.decodeReturn(SoapDecoder.java:316)
at weblogic.wsee.codec.soap11.SoapDecoder.decodeParts(SoapDecoder.java:165)
at weblogic.wsee.codec.soap11.SoapDecoder.decode(SoapDecoder.java:116)
at weblogic.wsee.codec.soap11.SoapCodec.decode(SoapCodec.java:136)
at weblogic.wsee.ws.dispatch.client.CodecHandler.decodeOutput(CodecHandler.java:117)
at weblogic.wsee.ws.dispatch.client.CodecHandler.decode(CodecHandler.java:94)
at weblogic.wsee.ws.dispatch.client.CodecHandler.handleResponse(CodecHandler.java:71)
at weblogic.wsee.handler.HandlerIterator.handleResponse(HandlerIterator.java:242)
at weblogic.wsee.handler.HandlerIterator.handleResponse(HandlerIterator.java:226)
at weblogic.wsee.ws.dispatch.client.ClientDispatcher.handleResponse(ClientDispatcher.java:161)
at weblogic.wsee.ws.dispatch.client.ClientDispatcher.dispatch(ClientDispatcher.java:116)
at weblogic.wsee.ws.WsStub.invoke(WsStub.java:89)
at weblogic.wsee.jaxrpc.StubImpl._invoke(StubImpl.java:335)
at com.test.IAccumulatorWS_Stub.getCurrentAccumulators(IAccumulatorWS_Stub.java:37)
... 1 more
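For context on the exception itself: java.util.ArrayList simply rejects a negative initial capacity, which is what the stub appears to pass when it decodes the empty SOAP array's length as -1 (the -1 is inferred from the trace; this sketch only reproduces the ArrayList behavior):

```java
import java.util.ArrayList;

public class CapacityDemo {
    public static void main(String[] args) {
        try {
            // The decoder ends up doing the equivalent of this when the
            // deserialized array length is -1
            new ArrayList<String>(-1);
        } catch (IllegalArgumentException e) {
            System.out.println(e.getMessage()); // Illegal Capacity: -1
        }
    }
}
```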
If I changed from
Long recipientId = 11597997730620L;
to
Long recipientId = 14998712L;
I get the error message "The status is: 4."
The group that generates the web service states that the WebLogic client code is choking on empty arrays being returned. They say they have to return empty arrays instead of nulls where no data is returned.
Is there something wrong with the weblogic tools
or am I using it incorrectly?
Yours,
Frustrated.
You'd have to check your generated code.
If it can't handle empty arrays, that would bring my trust in BEA's code down another notch (and from what I've experienced indirectly of their WS stack, I wasn't all that thrilled). -
Data being truncated in output - even though field large enough
I have an RDF report. I see that one of the columns is not displayed completely, even though it is big enough to accommodate the data that is being returned from the query. I increased the size of the field in the designer; still the data is being truncated. What could be going on here?
Besides enlarging the visible area of the column in the layout editor, you should also increase the Width property of that column from the property inspector.
Regards
Mostafa -
Dialog programming, data being washed out in TAB Control
Hi,
I am working on dialog programming, in which I am using a table control for user input (the data is not coming from a database table). Everything goes well up to the assignment of data to the internal table, but when
control goes to PBO by any means, like pressing ENTER etc., the data gets washed out.
PROCESS BEFORE OUTPUT.
MODULE TC_CONTROL.
LOOP AT it_data
INTO wa_data
WITH CONTROL tc_control
CURSOR tc_control-current_line.
MODULE tc_control_get_lines.
ENDLOOP.
PROCESS AFTER INPUT.
LOOP AT IT_DATA.
CHAIN.
FIELD WA_DATA-FREPS_N.
FIELD wa_data-TOEPS_N.
FIELD wa_data-PRCH_A.
FIELD wa_data-SRVC_AMT .
FIELD wa_data-ACCNT_C.
FIELD wa_data-AMT_D.
FIELD wa_data-NARR_X.
FIELD wa_data-CRPRD_N.
MODULE tc_control_modify ON CHAIN-REQUEST.
ENDCHAIN.
ENDLOOP.
*Abap program
MODULE TC_CONTROL OUTPUT.
DESCRIBE TABLE it_data LINES tc_control-lines.
ENDMODULE. " TC_CONTROL OUTPUT
CONTROLS: TC_CONTROL TYPE TABLEVIEW USING SCREEN 1000,
TC_CONTROL1 TYPE TABLEVIEW USING SCREEN 1000.
DATA: G_TC_CONTROLS_LINES LIKE SY-LOOPC,
G_TC_CONTROLS_LINES1 LIKE SY-LOOPC.
*& Module tc_control_get_lines OUTPUT
text
MODULE tc_control_get_lines OUTPUT.
g_tc_controls_lines = sy-loopc.
move-corresponding it_data to wa_data.
ENDMODULE. " tc_control_get_lines OUTPUT
MODULE tc_control_modify INPUT.
move-corresponding wa_data to it_data.
MODIFY it_data
FROM wa_data
INDEX tc_control-current_line.
append it_data.
clear it_data.
ENDMODULE. " tc_control_modify INPUT
Please suggest me any clue.
Thanks in advance
vijay dwivedi
Hi,
I have understood the problem.
In your ABAP code, replace all occurrences of wa_data with the structure name.
Use the TABLES keyword to declare the structure. That structure will be the same as the reference table of the table control.
Here the structure is SPFLI.
Check the code below; it should resolve the issue.
ABAP code - -
program zsdn.
tables spfli. " Declare the structure
data : it_data like table of spfli with header line,
*wa_data TYPE spfli, " commented
w_i type i.
*CONTROLS TC_CONTROL TYPE TABLEVIEW USING SCREEN 100.
controls: tc_control type tableview using screen 1000,
tc_control1 type tableview using screen 1000.
data: g_tc_controls_lines like sy-loopc,
g_tc_controls_lines1 like sy-loopc.
module tc_control output.
describe table it_data lines tc_control-lines.
endmodule. " TC_CONTROL OUTPUT
module tc_control_get_lines output.
g_tc_controls_lines = sy-loopc.
move-corresponding it_data to spfli.
endmodule. " tc_control_get_lines OUTPUT
module tc_control_modify input.
move-corresponding spfli to it_data.
modify it_data
from spfli
index tc_control-current_line.
append it_data.
clear it_data.
endmodule. " tc_control_modify INPUT
module status_0100 output.
set pf-status 'STAT'.
* SET TITLEBAR 'xxx'.
endmodule. " STATUS_0100 OUTPUT
module user_command_0100 input.
case sy-ucomm.
when 'BACK' or 'EXIT' or 'CANCEL'.
leave to screen 0.
endcase.
endmodule. " USER_COMMAND_0100 INPUT
*& Module POPLATE_TABLE OUTPUT
* text
module poplate_table output.
if it_data is initial.
select * from spfli into table it_data.
endif.
endmodule. " POPLATE_TABLE OUTPUT
Screen code (Scr no 1000) - -
PROCESS BEFORE OUTPUT.
MODULE status_0100.
MODULE poplate_table.
MODULE tc_control.
LOOP AT it_data WITH CONTROL tc_control CURSOR w_i.
MODULE tc_control_get_lines.
ENDLOOP.
PROCESS AFTER INPUT.
MODULE user_command_0100.
LOOP AT it_data.
MODULE tc_control_modify ON CHAIN-REQUEST.
* ENDCHAIN.
ENDLOOP.
Regards
Pinaki -
Bug: 1st Date value returned from CF function as an Array
I am encountering an annoying behavior in Flex. I call a CF function to return a query into an arraycollection - very basic. If the SELECT statement in the CF function includes any dates as returned values, the first date in the SELECT statement comes back into Flex as an array object that includes all of the fields in the query. The other dates are returned normally as Date data types.
This started happening a month or two ago, and I've reproduced it on several machines. Needless to say, it is causing numerous errors when my code expects a Date and gets an Array.
Do you have any ideas of what might be causing this and what I can do about it?
Thanks!!
I'm not sure where to post it, but here are some snippets:
in the cfc, I have a function as such:
<CFFUNCTION name="GetPersonLog" returntype="query">
<CFARGUMENT name="STUDENTID" type="numeric" required="Yes">
<CFQUERY name="qry" datasource="#connection_string#">
SELECT tblStudentLog.STUDENTLOGID, tblStudentLog.STUDENTID,
<!---GETDATE() AS TESTTHIS,--->
tblStudentLog.LOGDATE,
tblStudentLog.LASTUPDATED,
tblStudentLog.LOGENTRY
FROM tblStudentLog LEFT OUTER JOIN
tblStudentDiscipline ON tblStudentLog.StudentDisciplineID = tblStudentDiscipline.StudentDisciplineID
WHERE tblStudentLog.STUDENTID = <CFQUERYPARAM value = "#studentid#">
ORDER BY tblStudentLog.LOGDATE DESC
</CFQUERY>
<CFRETURN qry>
</CFFUNCTION>
You see I have a commented-out (REMmed) line to get a test date. If I check the results in CF using cfdump, all dates come back as dates.
In flex, the handler for the call to this function is simple:
private function PersonLogHandler(event:ResultEvent):void{
personlog = event.result as ArrayCollection;
CursorManager.removeBusyCursor();}
If I stop the code here and evaluate the results in debug mode, I find that the LOGDATE result is an array of values, while the LASTUPDATED field is a date.
If I put back the GETDATE() AS TESTTHIS, statement in the cfc function, then the result includes LOGDATE as a date object, but now the TESTTHIS result is an array.
I have experienced this on 2 different machines, and it has brought my development to a standstill. I can't think of what changed to start these problems, except that I am now using Flash Player 10 (with debugger). This happens in Firefox and IE. I am using Flex Builder 3 and ColdFusion 8. -
Site Web Analytics - no usage data being generated
Hello all:
I have a SharePoint Foundation 2013 farm with 2 WFE - 1 Search Server and 1 DB server. Search Service Application has been configured and functioning properly. Usage and health Data Service Application has been created and started. Usage
data collection is enabled and the "Analytics Usage" check box is checked. Usage Data Import and Usage Data Processing timer jobs are scheduled and run successfully.
But, I still get the following error when I go to the Site Web Analytics "A web analytics report is not available for this site. Usage processing may be disabled on this server or the
usage data for this site has not been processed yet."
After doing some research, some folks have suggested the following, which has to do with manually enabling the receivers via PowerShell. I have done this, but there is still no report and the same error.
http://geekswithblogs.net/bjackett/archive/2013/08/26/powershell-script-to-workaround-no-data-in-sharepoint-2013-usage.aspx
Other Internet searches indicate that Web Analytics Reports is no longer available in SharePoint Foundation 2013:
http://blogs.msdn.com/b/chandru/archive/2013/08/31/sharepoint-2013-web-analytics-report-where-is-it.aspx
http://sharepoint.stackexchange.com/questions/63099/where-is-the-web-analytics-service-in-sharepoint-2013
There is also a TechNet question which indicates that "Microsoft Support confirmed me there's a bug in SharePoint Foundation 2013 in the Database that's going to be fixed in the June or August CU"
http://social.technet.microsoft.com/Forums/sharepoint/en-US/5372109c-8a6e-4d31-aa34-13b6cbde52cf/sharepoint-foundation-2013-web-analytics?forum=sharepointgeneral
But, there is no indication of whether this bug has been addressed or not.
Therefore, I would really like to know what the deal is with this issue. At the moment, I do not see any usage data being generated on any of the SharePoint Foundation servers in the farm.
Please advise.
Thank you,
Rumi
Hi Rumi,
I found the same issue internally, which says that the Site Web Analytics links are no longer valid in SharePoint Foundation 2013 due to the changes in the analytics service application architecture, so you may need the SharePoint Server Enterprise edition to use this feature.
Symptom
- Recently, we upgraded to SharePoint Foundation 2013 from WSS 3.0. In SharePoint Foundation 2013 sites, we see the option to click on Site Web Analytics reports but when we click on it, we get an error.
- Clicking on Site Web Analytics reports from Site Settings \ Site Actions produces the error: “A web analytics report is not available for this site. Usage processing may be disabled on this server or the usage data for this site has not been processed yet.”
- We have ensured we have logging enabled (multiple categories)
- Example Site: http://sharepoint2/sites/IT/Projects/SAP/_layouts/15/usageDetails.aspx
Cause
By Design
1) The links in Site Settings from a site collection are no longer valid in SharePoint 2013 (due to Analytics Service application architecture changes; it is part of the Search Service now)
2) SharePoint Foundation 2013 does not support Usage Reporting Analytics
Resolution
o Purchase a license for SharePoint Server 2013 Enterprise, and build out a farm for it (the Foundation SKU cannot be upgraded in-place to Server).
o Once built up, you could copy your databases over and attach them to the Server farm and do your cutover.
o Going forward from there, you would be able to have access to the Usage reports.
Also, as you have found, that MSDN blog has the explanation that it is not available in SPF 2013.
http://blogs.msdn.com/b/chandru/archive/2013/08/31/sharepoint-2013-web-analytics-report-where-is-it.aspx
http://technet.microsoft.com/en-us/library/jj819267.aspx#bkmk_FeaturesOnPremise
Thanks,
Daniel Yang
Forum Support
Please remember to mark the replies as answers if they help and unmark them if they provide no help. If you have feedback for TechNet Subscriber Support, contact
[email protected].
Daniel Yang
TechNet Community Support -
Hello Guys,
I am creating a result source from Central Admin. If I create it from Central Admin, it works fine. But if I create the result source from PowerShell scripts, it shows me the following error message.
An exception of type 'Microsoft.Office.Server.Search.Query.InternalQueryErrorException' occurred in Microsoft.Office.Server.Search.dll but was not handled in user code
Additional information: Search has encountered a problem that prevents results from being returned. If the issue persists, please contact your administrator.
Any suggestion ?
Thanks in Advance.
Hi,
Please provide more specific information about the issue. What type of result source did you try creating via PowerShell?
Make sure you are using the appropriate permissions and search service application.
Here is the reference for creating a result source via script:
http://technet.microsoft.com/en-us/library/ff607867(v=office.15).aspx
Regards,
Rebecca Tu
TechNet Community Support -
How do I create a backup file in iTunes for iPod touch 4G game app data? Is there a way to do it? I want to try an app on my friend's computer, but you can't add apps on another computer without having your own iPod's data deleted. Thanks for any help!
I want to know how to create a backup file (because I'm pretty new to iTunes, and it's still hard for me to use), how to store my app data/media/videos, etc., and how I can retrieve them when I'm done with the app I tried on my friend's computer.
If anyone can help, it'd be great! Thank you so much!
Sure - glad to help you. You will not lose any data by changing syncing to the MacBook Pro from the iMac. You have set up Time Machine, right? That's how you'd do your backup, so I was told, and how I do my backup on my Mac. You should be able to set a password for it. Save it. Your stuff should be saved there. So if you want to make your MacBook Pro your primary computer, I suppose, back up your stuff with Time Machine, turn off Time Machine on the iMac, turn it on on the new MacBook Pro, select the hard drive in your Time Capsule, enter your password, and do a backup from there. It might work, and it might take a while, but it should go. As for clogging the hard drive, I can't say. Depends how much stuff you have, and the hard drive's capacity. As for moving syncing from your iMac to your MacBook Pro, it should be the same. Your phone uses iTunes to sync and so that data should be in the cloud. You can move your iTunes library to your new MacBook Pro;
you should be able to sync your phone on your new MacBook Pro. Don't know if you can move the older backups yet-maybe try someone else, anyways,
This handy article from Apple explains how
How to move your iTunes library to a new computer - Apple Support
don't forget to de-authorize your iMac if you don't want to play purchased stuff there
and re-authorize your new macBook Pro
time machine is an application, and should be found in the Applications folder. it is built in to OS X, so there is nothing else to buy. double click on it, get it going, choose the Hard drive in your Time capsule/Airport as your backup Time Machine and go for it. You should see a circle with an arrow on the top right hand of your screen (the Desktop), next to the bluetooth icon, and just after the wifi and eject key (looks sorta like a clock face). This will do automatic backups of your stuff. -
My iPod is disabled and the message requests to try again in 223004 minutes. I think it might have something to do with the date being set wrong before it went into disabled mode. Can you assist in enabling it?
You'll need to connect it to the iTunes library you normally sync it with and restore it. If iTunes asks you for this passcode before it will let you proceed, connect the iPod to iTunes in recovery mode instead using the instructions in this Apple support document.
iOS: Unable to update or restore
B-rock -
How to delimit text file data being downloaded to the application server
Hi All,
How do I delimit the data, i.e. put a tab between the fields, in the file being downloaded to the application server? Please provide example code, or tell me how I should change my code below. Thanks.
e.g. the desired output file on the application server:
Field1#Field2#Field3
What I currently get is as below:
Field1Field2Field3
My coding:
FORM download_outfile.
DATA : xfer(400).
IF rb_locl EQ 'X'.
DESCRIBE TABLE itab LINES sy-tfill.
IF sy-tfill GT 0.
CALL FUNCTION 'WS_DOWNLOAD'
EXPORTING
filename = p_file
filetype = 'DAT'
TABLES
data_tab = itab
EXCEPTIONS
file_open_error = 1
file_write_error = 2
invalid_filesize = 3
invalid_table_width = 4
invalid_type = 5
no_batch = 6
unknown_error = 7
OTHERS = 8.
IF sy-subrc EQ 0.
WRITE : / 'download done.'.
ELSE.
WRITE : / '4)Error occured'.
ENDIF.
ENDIF.
ELSE.
OPEN DATASET p_file FOR OUTPUT IN TEXT MODE.
DATA : l_subrc LIKE sy-subrc.
LOOP AT itab.
MOVE itab TO xfer.
TRANSFER xfer TO p_file.
IF sy-subrc NE 0.
l_subrc = sy-subrc.
ENDIF.
ENDLOOP.
IF l_subrc EQ 0.
WRITE :/ 'download done.'.
ELSE.
WRITE :/ 'error occurred.'.
ENDIF.
CLOSE DATASET p_file.
ENDIF.
ENDFORM.
Hi,
Please check this sample codes.
OPEN DATASET P_DOWN FOR OUTPUT IN TEXT MODE. " P_DOWN is the file to download
IF SY-SUBRC = 0.
CLEAR V_STRING.
*-- Download to the file
LOOP AT IT_DOWN.
CONCATENATE IT_DOWN-DISTTYPE
IT_DOWN-O_CITY
IT_DOWN-O_REGIO
IT_DOWN-O_PSTLZ
IT_DOWN-O_CTRY
IT_DOWN-D_CITY
IT_DOWN-D_REGIO
IT_DOWN-D_PSTLZ
IT_DOWN-D_CTRY
IT_DOWN-DISTANCE
INTO V_STRING
SEPARATED BY cl_abap_char_utilities=>horizontal_tab. " tab character (note: SEPARATED BY '09' would insert the literal text 09, not a tab)
TRANSFER V_STRING TO P_DOWN.
CLEAR V_STRING.
ENDLOOP.
Regards,
Ferry Lianto -
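The pattern in the answer above is to join each record's fields with a tab character and then transfer the resulting line to the dataset. The same idea, as a minimal illustrative sketch in Python rather than ABAP (the file name and field values here are made up for the example):

```python
# Sketch of tab-delimited output: join each record's fields with a tab
# before writing the line, mirroring CONCATENATE ... SEPARATED BY tab.
records = [
    ("Field1", "Field2", "Field3"),
    ("A", "B", "C"),
]

with open("outfile.txt", "w") as f:
    for rec in records:
        # "\t".join inserts a tab between the fields of one record
        f.write("\t".join(rec) + "\n")
```

Opening the resulting file in a spreadsheet or splitting each line on the tab character recovers the individual fields.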
Error with calendar date being extracted : PSA - ODS - cubes correction
Hello to all,
I have an issue with incorrect calendar date being captured by BW.
Here is the analysis:
We have a red request in the invoice cubes because of an incorrect calendar date. The load is done via PSA -> ODS -> cubes.
How do I do the correction? Kindly specify whether or not to change the red QM status in the cubes, and what to do with the data mart status of the ODS and the ODS request. Which status has to be changed? Which request has to be deleted? Thanks.
Hi,
What I did:
changed QM status to green of the Info Cubes
deleted the green status of requests in the Info Cubes
deleted the Data Mart status of ODS
deleted the request of ODS
edited manually the data into the PSA
started the update immediately
Then....
I will wait to see whether the request is updated in the ODS.
Then
I will check whether the delta job from the ODS to the cube runs correctly.
Please advise. Thanks. -
How Secure is the data being backed up wirelessly?
What is the likelihood of someone being able to access data being wirelessly backed up to a TC?
Thanks.
It's pretty dependent on the kind of wireless encryption (if any) that you're using, the most secure being WPA2, with WPA/WPA2 a tiny bit less secure, and WEP about as secure as throwing copies of your house keys attached to photos of the expensive items in your house with your address on the back onto a crowded sidewalk. Okay, so it's not quite that insecure, but it's still pretty crappy.
Assuming you're using WPA2 or WPA/WPA2, your data should be getting pretty securely transmitted, though once it's on the Time Capsule it isn't encrypted, and if someone can physically access it, they can connect to it via Ethernet. If there's stuff you want to keep really secure, you should probably investigate encrypting it, possibly using encrypted disk images, but if your main worry is about it being intercepted in the air, providing you've got good wireless security and a decent password powering it, you should be fine. -
Position of the days and dates being on the right side of the day in the week calendar.
I'm having a huge problem with the days and dates being on the right side of the day in the week calendar. Can I change that to the left upper side?
It makes the whole schedule off balance, you see.
Thanks for stating that. So quickly. That *****. I also now see that more people have problems with this. Where can I complain?
Maybe you are looking for
-
Error in executing a procedure
Hello All, I am trying to use an update statement in a PL/SQL procedure with some date parameters. Can anyone please tell me what's wrong with the syntax? Also, please suggest any better ways to do this. Thanks a lot in advance. PROCEDURE CREATE OR REPLA
-
How do you correct the merging of contacts, calendars, etc with other members of your family on your account?
-
Program "SAPLSZA1" tried to use screen 0000 - vendor display XK03 - dump
Hello, when I try to display a vendor (XK03), program SAPMF02K falls into a dump. The problem is in calling a subscreen in the address area. CALL SUBSCREEN ADDRESS INCLUDING 'SAPLSZA1' '0300'. This is the dump: The termination occurred in the ABAP program "SAPLSZA1 "
-
Can't group slices in pie chart
I used to be able to shift click on a number of pie slices and group them and then drag (explode) the group from the rest of the pie. Doesn't seem to work now. What am I doing wrong?
-
Comparative vendor analysis needed
Hi Everyone! If anyone has a comparative study or analysis of the various MDM vendors in the markets, please forward it to me. Thanks & Regards, Taj.