Re: Data streaming error
Hello,
We are using JDeveloper 11.1.1.4.0.
A page we developed works fine in the JDeveloper integrated WebLogic server, but after we deploy it to the standalone server, it sometimes works fine and sometimes gives the data streaming error.
If we restart the application server, the error goes away.
I have visited many forums here, but they were not helpful.
Is there any configuration we have to do on the WebLogic server side, or is the problem in the code, as mentioned in a forum?
Thanks in Advance.
Vijay.
Hi,
unfortunately your question isn't helpful either, as it does not provide enough information (stack traces or log entries, for example). Since you say this happens randomly, it would be interesting to know whether the network could be an issue. Also, have you tested your application with 11.1.1.6, for example, to verify whether the problem is specific to 11.1.1.4 or a general one?
Frank
Similar Messages
-
Data Streaming error in BPM workspace
Hi All
We developed a BPM process and an ADF form with the BPM 11g suite and deployed them to Oracle WebLogic Server 10.3. When fetching data from the DB into the ADF form in the same process, small amounts of data come through fine, but when fetching a large volume of data into that ADF form we get a popup saying "Data streaming".
Could you please guide me to resolve this issue?
Thank you
Balaji J
Please paste the exact error stack along with your JDeveloper version for us to help you.
-
Hello All,
We are using a two tier architecture.
Our Corp server calls the refinery server.
Our CORP MII server uses user id abc_user to connect to the refinery data server.
The user id abc_user has the SAP_xMII_Dynamic_Query role.
The data server also has the checkbox for allow dynamic query enabled.
But we are still getting the following error:
Error has occurred while processing data stream
Dynamic Query role is not assigned to the Data Server; Use query template
Once we add the SAP_xMII_Dynamic_Query role to the data server, everything works fine. Is this behavior by design?
Thanks,
Kiran
Thanks, Anushree!
I thought that just adding the role to the user and enabling the dynamic query checkbox on the data server should work.
But we even needed to add the role to the data server.
Thanks,
Kiran -
Streams apply having 60k no data found error
We have implemented Streams recently. The configuration and streaming are working fine. The only problem I am having is a "NO DATA FOUND" error in the apply process.
It's uni-directional streaming.
We are not updating, deleting, or inserting any rows on the destination database (both Oracle).
Why am I seeing 60k conflicts?
Is there any configuration I am missing?
How can I fix the "no data found" issue and make my stream 100 percent reliable in real time?
Thanks in advance
Palani
Print the transaction/LCR using print_transaction/print_lcr. Compare the old values from the LCR to the values in the destination database table; one or more column values must differ. Trace back to the root operation that is causing the value difference and fix it.
Otherwise, you can configure the Streams apply process not to compare the old values (except keys) by using DBMS_APPLY_ADM.COMPARE_OLD_VALUES. -
Error while loading Reported Financial Data from Data Stream
Hi Guys,
I'm facing the following error while loading Reported Financial Data from Data Stream:
Message no. UCD1003: Item "Blank" is not defined in Cons Chart of Accts 01
The message appears in the target data. The Item field is not filled in almost 50% of the target data records, and the error message appears.
Upon deeper analysis I found that some items are defined with a Dr./Cr. sign of + and with no breakdown. When these items appear as negative (Cr.) in the source data, they are not loaded properly into the target data: the Item field is not filled, hence the error.
For example, item "114190 - Prepayments" is defined with a + Debit/Credit sign. When it is posted as negative/credit in the source data, it is not written properly to the target.
Should I define a breakdown category for these items? I think there's something wrong with the item definitions, or I'm missing something.
I would highly appreciate your quick assistance in this.
Kind regards,
Amir
Found the answer in OSS Note 642591.
Thanks -
Most simple query on Event Hub stream (json) constantly gives Data Conversion Errors
Hello all,
I had been playing with ASA in December and didn't have any issues; my queries kept working and output the data as needed. However, since January, in a new demo I created, I constantly get data conversion errors. The scenario is described below, but I have the following questions:
Where can I get detailed information on the data conversion errors? I can't find any at the moment (not in the operation logs and not in the table storage of my diagnostic storage account).
What could be wrong in my scenario that could be causing these issues?
What could be wrong in my scenario and could be causing these issues
The scenario I have implemented is the following:
My local devices send EventData objects, serialized through Json.NET, to an Event Hub with 32 partitions.
I define my query input as an Event Hub stream and define the data as JSON/UTF-8. I give it the name TelemetryReadings.
Then I write my query as SELECT * FROM TelemetryReadings.
As output, I create a blob output with CSV/UTF-8 encoding.
After that, I start the job.
The result is an empty blob container (no output written) and tons of data conversion errors in the monitoring graph. What should I do to get this solved?
Thanks
Sam Vanhoutte - CTO Codit - VTS-P BizTalk - Windows Azure Integration: www.integrationcloud.eu
So, apparently the issue was related to the incoming objects I had: I was sending unsupported data types (boolean and Dictionary). I changed my code to remove these from the JSON, and that worked out well. A change had been deployed so that, instead of marking the unsupported fields as null, they threw an exception; that's why things worked earlier.
So, it had to do with the limitation that I mentioned in my earlier comment:
https://github.com/Azure/azure-content/blob/master/articles/stream-analytics-limitations.md
Unsupported type conversions result in NULL values
Any event values with type conversions not supported in the Data Types section of the Azure Stream Analytics Query Language Reference will result in a NULL value. In this preview release no error logging is in place for these conversion exceptions.
I am creating a blog post on this one
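The workaround described above can be sketched as follows: emit only the value types the preview supported (numbers and strings) and drop booleans and nested dictionaries before serialization. The original code used Json.NET in C#; this is a hand-rolled Java illustration with hypothetical field names, not the actual sender code:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class TelemetryJson {
    // Serialize only the value types the ASA preview handled (numbers and
    // strings); skip booleans and nested maps, as described in the fix above.
    static String toSupportedJson(Map<String, Object> reading) {
        StringBuilder sb = new StringBuilder("{");
        boolean first = true;
        for (Map.Entry<String, Object> e : reading.entrySet()) {
            Object v = e.getValue();
            if (v instanceof Boolean || v instanceof Map) continue; // unsupported types dropped
            if (!first) sb.append(',');
            first = false;
            sb.append('"').append(e.getKey()).append("\":");
            if (v instanceof Number) sb.append(v);
            else sb.append('"').append(v).append('"');
        }
        return sb.append('}').toString();
    }

    public static void main(String[] args) {
        Map<String, Object> reading = new LinkedHashMap<>();
        reading.put("deviceId", "sensor-1");   // hypothetical field
        reading.put("temperature", 21.5);
        reading.put("online", true);           // would have caused a conversion error
        System.out.println(toSupportedJson(reading)); // prints {"deviceId":"sensor-1","temperature":21.5}
    }
}
```

Dropping the fields on the sender side avoids the conversion exceptions the deployed change introduced, without touching the job's query.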
Sam Vanhoutte - CTO Codit - VTS-P BizTalk - Windows Azure Integration: www.integrationcloud.eu -
Oracle B2B UI - Reports - "An unknown error occurred during data streaming."
I have a brand new Oracle SOA Suite 11g R1 PS2 installation, following my own instructions even, in which everything seems to work except the B2B web console "Reports" functionality. When no messages are available for display (as in none were sent or all were purged), the display looks fine. When there are supposed to be messages to display, I get a web browser dialogue box saying "An unknown error occurred during data streaming. Ask your system administrator to inspect the server-side logs." and no data. I performed a clean install on brand new VMs 4 times already, 2-5 hours each time, and every time I get this. I used Windows XP 64-bit, Windows XP 32-bit, JDK 1.6.0_20 64-bit (on the 64-bit platform), JDK 1.6.0_20 32-bit, and the Oracle-provided JDK 1.6.0_18, all to no avail.
Can some kind soul please tell me what I should look at / fix to have this resolved?
The stack trace is below:
[2010-11-01T20:57:13.256+11:00] [AdminServer] [WARNING] [] [oracle.adfinternal.view.faces.renderkit.rich.SimpleSelectOneRenderer] [tid: [ACTIVE].ExecuteThread: '0' for queue: 'weblogic.kernel.Default (self-tuning)'] [userId: weblogic] [ecid: 0000Ik6rbdg2zGWjLxfP8A1CnclD000079,0] [APP: b2bui] [dcid: f21c2821705695cf:-4862872f:12c06d54454:-7ff0-00000000000000d1] Could not find selected item matching value "" in RichSelectOneChoice[UIXEditableFacesBeanImpl, id=value40]
[2010-11-01T20:57:13.490+11:00] [AdminServer] [ERROR] [] [oracle.adfinternal.view.faces.config.rich.RegistrationConfigurator] [tid: [ACTIVE].ExecuteThread: '1' for queue: 'weblogic.kernel.Default (self-tuning)'] [userId: weblogic] [ecid: 0000Ik6rbtn2zGWjLxfP8A1CnclD00007A,0] [APP: b2bui] [dcid: f21c2821705695cf:-4862872f:12c06d54454:-7ff0-00000000000000d5] Server Exception during PPR, #1[[
javax.servlet.ServletException: java.lang.AssertionError
at weblogic.servlet.internal.ServletStubImpl.execute(ServletStubImpl.java:341)
at weblogic.servlet.internal.TailFilter.doFilter(TailFilter.java:26)
at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
at oracle.help.web.rich.OHWFilter.doFilter(Unknown Source)
at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
at oracle.adfinternal.view.faces.webapp.rich.RegistrationFilter.doFilter(RegistrationFilter.java:97)
at org.apache.myfaces.trinidadinternal.webapp.TrinidadFilterImpl$FilterListChain.doFilter(TrinidadFilterImpl.java:420)
at oracle.adfinternal.view.faces.activedata.AdsFilter.doFilter(AdsFilter.java:60)
at org.apache.myfaces.trinidadinternal.webapp.TrinidadFilterImpl$FilterListChain.doFilter(TrinidadFilterImpl.java:420)
at org.apache.myfaces.trinidadinternal.webapp.TrinidadFilterImpl._doFilterImpl(TrinidadFilterImpl.java:247)
at org.apache.myfaces.trinidadinternal.webapp.TrinidadFilterImpl.doFilter(TrinidadFilterImpl.java:157)
at org.apache.myfaces.trinidad.webapp.TrinidadFilter.doFilter(TrinidadFilter.java:92)
at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
at oracle.tip.b2b.ui.util.SessionTimeoutFilter.doFilter(SessionTimeoutFilter.java:232)
at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
at oracle.security.jps.ee.http.JpsAbsFilter$1.run(JpsAbsFilter.java:94)
at java.security.AccessController.doPrivileged(Native Method)
at oracle.security.jps.util.JpsSubject.doAsPrivileged(JpsSubject.java:313)
at oracle.security.jps.ee.util.JpsPlatformUtil.runJaasMode(JpsPlatformUtil.java:414)
at oracle.security.jps.ee.http.JpsAbsFilter.doFilter(JpsAbsFilter.java:138)
at oracle.security.jps.ee.http.JpsFilter.doFilter(JpsFilter.java:71)
at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
at oracle.dms.wls.DMSServletFilter.doFilter(DMSServletFilter.java:330)
at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
at weblogic.servlet.internal.RequestEventsFilter.doFilter(RequestEventsFilter.java:27)
at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
at weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.doIt(WebAppServletContext.java:3684)
at weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.run(WebAppServletContext.java:3650)
at weblogic.security.acl.internal.AuthenticatedSubject.doAs(AuthenticatedSubject.java:321)
at weblogic.security.service.SecurityManager.runAs(SecurityManager.java:121)
at weblogic.servlet.internal.WebAppServletContext.securedExecute(WebAppServletContext.java:2268)
at weblogic.servlet.internal.WebAppServletContext.execute(WebAppServletContext.java:2174)
at weblogic.servlet.internal.ServletRequestImpl.run(ServletRequestImpl.java:1446)
at weblogic.work.ExecuteThread.execute(ExecuteThread.java:201)
at weblogic.work.ExecuteThread.run(ExecuteThread.java:173)
Caused by: java.lang.AssertionError
at oracle.adfinternal.view.faces.renderkit.rich.table.TableRenderingContext.isCurrentCellInRowBanded(TableRenderingContext.java:258)
at oracle.adfinternal.view.faces.renderkit.rich.table.TableRenderingContext.isCurrentCellBanded(TableRenderingContext.java:253)
at oracle.adfinternal.view.faces.renderkit.rich.table.BaseColumnRenderer.renderDataCellClasses(BaseColumnRenderer.java:1036)
at oracle.adfinternal.view.faces.renderkit.rich.table.BaseColumnRenderer.renderDataCell(BaseColumnRenderer.java:1149)
at oracle.adfinternal.view.faces.renderkit.rich.table.BaseColumnRenderer.encodeAll(BaseColumnRenderer.java:103)
at oracle.adf.view.rich.render.RichRenderer.encodeAll(RichRenderer.java:1369)
at org.apache.myfaces.trinidad.render.CoreRenderer.encodeEnd(CoreRenderer.java:335)
at org.apache.myfaces.trinidad.component.UIXComponentBase.encodeEnd(UIXComponentBase.java:765)
at org.apache.myfaces.trinidad.render.CoreRenderer.encodeChild(CoreRenderer.java:415)
at oracle.adf.view.rich.render.RichRenderer.encodeChild(RichRenderer.java:2567)
at oracle.adfinternal.view.faces.renderkit.rich.TableRenderer.renderDataBlockRows(TableRenderer.java:1932)
at oracle.adfinternal.view.faces.renderkit.rich.TableRenderer._renderSingleDataBlock(TableRenderer.java:1601)
at oracle.adfinternal.view.faces.renderkit.rich.TableRenderer._handleDataFetch(TableRenderer.java:1003)
at oracle.adfinternal.view.faces.renderkit.rich.TableRenderer.encodeAll(TableRenderer.java:504)
at oracle.adf.view.rich.render.RichRenderer.encodeAll(RichRenderer.java:1369)
at org.apache.myfaces.trinidad.render.CoreRenderer.encodeEnd(CoreRenderer.java:335)
at org.apache.myfaces.trinidad.component.UIXComponentBase.encodeEnd(UIXComponentBase.java:765)
at org.apache.myfaces.trinidad.component.UIXCollection.encodeEnd(UIXCollection.java:529)
at org.apache.myfaces.trinidad.component.UIXComponentBase.__encodeRecursive(UIXComponentBase.java:1515)
at org.apache.myfaces.trinidad.component.UIXComponentBase.encodeAll(UIXComponentBase.java:785)
at oracle.adfinternal.view.faces.util.rich.InvokeOnComponentUtils$EncodeChildVisitCallback.visit(InvokeOnComponentUtils.java:113)
at org.apache.myfaces.trinidadinternal.context.PartialVisitContext.invokeVisitCallback(PartialVisitContext.java:222)
at org.apache.myfaces.trinidad.component.UIXComponent.visitTree(UIXComponent.java:378)
at org.apache.myfaces.trinidad.component.UIXComponent.visitTree(UIXComponent.java:326)
at org.apache.myfaces.trinidad.component.UIXComponent.visitTree(UIXComponent.java:443)
at org.apache.myfaces.trinidad.component.UIXComponent.visitTree(UIXComponent.java:326)
at org.apache.myfaces.trinidad.component.UIXComponent.visitTree(UIXComponent.java:443)
at org.apache.myfaces.trinidad.component.UIXComponent.visitTree(UIXComponent.java:326)
at org.apache.myfaces.trinidad.component.UIXComponent.visitTree(UIXComponent.java:443)
at org.apache.myfaces.trinidad.component.UIXComponent.visitTree(UIXComponent.java:326)
at org.apache.myfaces.trinidad.component.UIXComponent.visitTree(UIXComponent.java:443)
at org.apache.myfaces.trinidad.component.UIXComponent.visitTree(UIXComponent.java:326)
at org.apache.myfaces.trinidad.component.UIXComponent.visitTree(UIXComponent.java:443)
at org.apache.myfaces.trinidad.component.UIXComponent.visitTree(UIXComponent.java:326)
at org.apache.myfaces.trinidad.component.UIXComponent.visitTree(UIXComponent.java:443)
at org.apache.myfaces.trinidad.component.UIXComponent.visitTree(UIXComponent.java:448)
at org.apache.myfaces.trinidad.component.UIXComponent.visitTree(UIXComponent.java:326)
at org.apache.myfaces.trinidad.component.UIXComponent.visitTree(UIXComponent.java:443)
at org.apache.myfaces.trinidad.component.UIXComponent.visitTree(UIXComponent.java:326)
at org.apache.myfaces.trinidad.component.UIXComponent.visitTree(UIXComponent.java:443)
at org.apache.myfaces.trinidad.component.UIXComponent.visitTree(UIXComponent.java:326)
at org.apache.myfaces.trinidad.component.UIXComponent.visitTree(UIXComponent.java:443)
at org.apache.myfaces.trinidad.component.UIXComponent.visitTree(UIXComponent.java:326)
at org.apache.myfaces.trinidad.component.UIXComponent.visitTree(UIXComponent.java:443)
at org.apache.myfaces.trinidad.component.UIXComponent.visitTree(UIXComponent.java:448)
at org.apache.myfaces.trinidad.component.UIXComponent.visitTree(UIXComponent.java:326)
at org.apache.myfaces.trinidad.component.UIXComponent.visitTree(UIXComponent.java:443)
at org.apache.myfaces.trinidad.component.UIXComponent.visitTree(UIXComponent.java:326)
at org.apache.myfaces.trinidad.component.UIXComponent.visitTree(UIXComponent.java:443)
at org.apache.myfaces.trinidad.component.UIXComponent.visitTree(UIXComponent.java:326)
at org.apache.myfaces.trinidad.component.UIXComponent.visitTree(UIXComponent.java:443)
at org.apache.myfaces.trinidad.component.UIXComponent.visitTree(UIXComponent.java:448)
at org.apache.myfaces.trinidad.component.UIXComponent.visitTree(UIXComponent.java:326)
at org.apache.myfaces.trinidad.component.UIXComponent.visitTree(UIXComponent.java:443)
at org.apache.myfaces.trinidad.component.UIXComponent.visitTree(UIXComponent.java:326)
at org.apache.myfaces.trinidad.component.UIXComponent.visitTree(UIXComponent.java:443)
at org.apache.myfaces.trinidad.component.UIXComponent.visitTree(UIXComponent.java:326)
at org.apache.myfaces.trinidad.component.UIXComponent.visitTree(UIXComponent.java:443)
at org.apache.myfaces.trinidad.component.UIXComponent.visitTree(UIXComponent.java:326)
at org.apache.myfaces.trinidad.component.UIXComponent.visitTree(UIXComponent.java:443)
at org.apache.myfaces.trinidad.component.UIXComponent.visitTree(UIXComponent.java:326)
at org.apache.myfaces.trinidad.component.UIXComponent.visitTree(UIXComponent.java:443)
at org.apache.myfaces.trinidad.component.UIXComponent.visitTree(UIXComponent.java:326)
at org.apache.myfaces.trinidad.component.UIXComponent.visitTree(UIXComponent.java:443)
at org.apache.myfaces.trinidad.component.UIXComponent.visitTree(UIXComponent.java:326)
at org.apache.myfaces.trinidad.component.UIXComponent.visitTree(UIXComponent.java:443)
at oracle.adfinternal.view.faces.util.rich.InvokeOnComponentUtils.renderChild(InvokeOnComponentUtils.java:43)
at oracle.adfinternal.view.faces.streaming.StreamingDataManager._pprComponent(StreamingDataManager.java:611)
at oracle.adfinternal.view.faces.streaming.StreamingDataManager.execute(StreamingDataManager.java:460)
at oracle.adfinternal.view.faces.renderkit.rich.DocumentRenderer._encodeStreamingResponse(DocumentRenderer.java:3200)
at oracle.adfinternal.view.faces.renderkit.rich.DocumentRenderer.encodeAll(DocumentRenderer.java:1245)
at oracle.adf.view.rich.render.RichRenderer.encodeAll(RichRenderer.java:1369)
at org.apache.myfaces.trinidad.render.CoreRenderer.encodeEnd(CoreRenderer.java:335)
at org.apache.myfaces.trinidad.component.UIXComponentBase.encodeEnd(UIXComponentBase.java:765)
at org.apache.myfaces.trinidad.component.UIXComponentBase.__encodeRecursive(UIXComponentBase.java:1515)
at org.apache.myfaces.trinidad.component.UIXComponentBase.encodeAll(UIXComponentBase.java:785)
at javax.faces.component.UIComponent.encodeAll(UIComponent.java:942)
at com.sun.faces.application.ViewHandlerImpl.doRenderView(ViewHandlerImpl.java:271)
at com.sun.faces.application.ViewHandlerImpl.renderView(ViewHandlerImpl.java:202)
at javax.faces.application.ViewHandlerWrapper.renderView(ViewHandlerWrapper.java:189)
at org.apache.myfaces.trinidadinternal.application.ViewHandlerImpl.renderView(ViewHandlerImpl.java:193)
at oracle.adfinternal.view.faces.lifecycle.LifecycleImpl._renderResponse(LifecycleImpl.java:710)
at oracle.adfinternal.view.faces.lifecycle.LifecycleImpl._executePhase(LifecycleImpl.java:273)
at oracle.adfinternal.view.faces.lifecycle.LifecycleImpl.render(LifecycleImpl.java:205)
at javax.faces.webapp.FacesServlet.service(FacesServlet.java:266)
at weblogic.servlet.internal.StubSecurityHelper$ServletServiceAction.run(StubSecurityHelper.java:227)
at weblogic.servlet.internal.StubSecurityHelper.invokeServlet(StubSecurityHelper.java:125)
at weblogic.servlet.internal.ServletStubImpl.execute(ServletStubImpl.java:300)
... 34 more
]]
Hello, Anuj.
Thanks for getting back to me. I did try with another browser.
Further investigation tells me that this issue is somehow caused by the domain creation process when I create a single-server domain with SOA Suite and OSB. When I create a separate domain for OSB, all works as expected (as far as I can tell).
Regards
Michael -
NW 6.5 Restore error 20550 Invalid Data Stream
Hi Guys,
We are trying to perform a restore using Yosemite Server Backup 8.9 and receiving the above error.
Out of 36 GB, 99% was restored correctly, but 6,000 files were missing with the above error.
Anybody have any idea what invalid data stream means?
Any help much appreciated
Robbie
Robbiecookie101,
> Anybody have any idea what invalid data stream means?
That it had problems reading the media? Anyway, what does the vendor say?
- Anders Gustafsson (Sysop)
The Aaland Islands (N60 E20)
Novell has a new enhancement request system, now known as the requirement portal. Customers who would like to give input into upcoming releases of Novell products should go to
http://www.novell.com/rms -
Network Stream Error -314340 due to buffer size on the writer endpoint
Hello everyone,
I just wanted to share a somewhat odd experience we had with the network stream VIs. We found this problem in LV2014 but aren't aware if it is new or not. I searched for a while on the network stream endpoint creation error -314340 and couldn't come up with any useful links to our problem. The good news is that we have fixed our problem but I wanted to explain it a little more in case anyone else has a similar problem.
The specific network stream error -314340 should seemingly occur if you are attempting to connect to a network stream endpoint that is already connected to another endpoint or in which the URL points to a different endpoint than the one trying to connect.
We ran into this issue on attempting to connect to a remote PXI chassis (PXIe-8135) running LabVIEW real-time from an HMI machine, both of which have three NICs and access different networks. We have a class that wraps the network stream VIs and we have deployed this class across four machines (Windows and RT) to establish over 30 network streams between these machines. The class can distinguish between messaging streams that handle clusters of control and status information and also data streams that contain a cluster with a timestamp and 24 I16s. It was on the data network streams that we ran into the issue.
The symptoms of the problem were that if we attempted to use the HMI computer with a reader endpoint specifying the URL of the writer endpoint on the real-time PXI, the reader endpoint would return with an error of -314340, indicating the writer endpoint was pointing to a third location. Leaving the URL on the writer endpoint blank and running in real-time interactive mode or as a startup VI made no difference. However, the writer endpoint would return without error and eventually catch a "remote endpoint destroyed". To make things more interesting, if you specified the URL on the writer endpoint instead of the reader endpoint, the connection would be made as expected.
Ultimately, through experimenting with it, we found that the buffer size of the create-writer-endpoint call for the data stream was causing the problem and that we had fat-fingered the constants for this buffer size. Also, pre-allocating or allocating the buffer on the fly made no difference. We imagine it may be because we are using a complex data type (a cluster with an array inside of it) and it can be difficult to allocate a buffer for this data type. We guess that when the reader endpoint establishes the connection to a writer with a large buffer size specified, the writer endpoint ultimately times out somewhere in the handshaking routine that is hidden below the surface.
I just wanted to post this so others would have a reference if they run into a similar situation and again for reference we found this in LV2014 but are not aware if it is a problem in earlier versions.
Thanks,
Curtiss
Hi Curtiss!
Thank you for your post! Would it be possible for you to add some steps that others can use to reproduce/resolve the issue?
Regards,
Kelly B.
Applications Engineering
National Instruments -
11:56:37 F01F Soap Attachment Stream Error: [8911], File Size = 1120068 Written Size=131072
11:56:37 F01F Soap Attachment Stream Error: [8911], File Size = 1120068 Written Size=131072
11:56:37 F01F Error streaming an attachment [8911]
11:56:37 F277 Soap Attachment Stream Error: [8911], File Size = 673015 Written Size=131072
11:56:37 F277 Soap Attachment Stream Error: [8911], File Size = 673015 Written Size=131072
11:56:37 F277 Error streaming an attachment [8911]
What are these?
Originally Posted by swishewk
11:56:37 F01F Soap Attachment Stream Error: [8911], File Size = 1120068 Written Size=131072
11:56:37 F01F Soap Attachment Stream Error: [8911], File Size = 1120068 Written Size=131072
11:56:37 F01F Error streaming an attachment [8911]
11:56:37 F277 Soap Attachment Stream Error: [8911], File Size = 673015 Written Size=131072
11:56:37 F277 Soap Attachment Stream Error: [8911], File Size = 673015 Written Size=131072
11:56:37 F277 Error streaming an attachment [8911]
What are these?
I believe the lines with the file size/written size are files in excess of the limit set in Data Synchronizer. -
Data Conversion Errors for the last week
We've been running a simple Stream Analytics job for a little over a month now with a very light workload. The input is an Event Hub and the output is SQL Server. We noticed today that we haven't received anything into SQL Server since 2014-12-08 (we don't receive events every day, so we only know that everything still worked on the 8th of December), so we checked the job's logs. It seems the job is failing to process all the messages: the value of "Data Conversion Errors" is high.
I wonder what could have happened? We haven't touched the client since we started the job so it's still sending the messages in same format. And we haven't touched the job's query either.
Has there been an update to either Stream Analytics or Event Hubs which could cause the issue we're seeing?
I've followed the TollApp instructions word for word (except the thing with NamespaceType "Messaging" that has been added to New-AzureSBNamespace).
I have 0 lines in the output, and this is the service log:
Correlation ID:
e94f5b9e-d755-4160-b49e-c8225ceced0c
Error:
Message:
After deserialization, 0 rows have been found. Possible reasons could be a missing header or malformed CSV input.
Message Time:
2015-01-21 10:35:15Z
Microsoft.Resources/EventNameV2:
sharedNode92F920DE-290E-4B4C-861A-F85A4EC01D82.entrystream_0_c76f7247_25b7_4ca6_a3b6_c7bf192ba44a#0.output
Microsoft.Resources/Operation:
Information
Microsoft.Resources/ResourceUri:
/subscriptions/eb880f80-0028-49db-b956-464f8439270f/resourceGroups/StreamAnalytics-Default-West-Europe/providers/Microsoft.StreamAnalytics/streamingjobs/TollData
Type:
CsvParserError
Then I stopped the job, and connected to the event hub with a console app and received that:
Message received. Partition: '11', Data: 'TollId,EntryTime,LicensePlate,State,Make,Model,VehicleType,VehicleWeight,Toll,Tag
85,21/01/2015 10:24:56,QBQ 1188,OR,Toyota,4x4,1,0,4,361203677
Message received. Partition: '11', Data: 'TollId,EntryTime,LicensePlate,State,Make,Model,VehicleType,VehicleWeight,Toll,Tag
33,21/01/2015 10:25:42,BSE 3166,PA,Toyota,Rav4,1,0,6,603558073
Message received. Partition: '11', Data: 'TollId,EntryTime,LiMessage received. Partition: '10', Data: 'TollId,EntryTime,LicensePlate,State,Make,Model,VehicleType,VehicleWeight,Toll,Tag
59,21/01/2015 10:23:59,AXD 1469,CA,Toyota,Camry,1,0,6,150568526
Message received. Partition: '10', Data: 'TollId,EntryTime,LicensePlate,State,Make,Model,VehicleType,VehicleWeight,Toll,Tag
25,21/01/2015 10:24:17,OLW 6671,NJ,Honda,Civic,1,0,5,729503344
Message received. Partition: '10', Data: 'TollId,EntryTime,LicensePlate,State,Make,Model,VehicleType,VehicleWeight,Toll,Tag
51,21/01/2015 10:24:23,LTV 6699,CA,Honda,CRV,1,0,5,169341662
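Each event above is a CSV header line followed by one data row. As a rough illustration (class and method names are mine, not from the TollApp sample), this is how a header-aware CSV reader would pull a field out of such a payload, and why a missing or malformed header leaves a deserializer with no rows:

```java
public class TollCsv {
    // Parse one event payload (a CSV header line followed by a data row) and
    // return the requested field by looking up its column in the header.
    static String fieldFromPayload(String payload, String field) {
        String[] lines = payload.split("\n");
        String[] header = lines[0].split(",");
        String[] row = lines[1].split(",");
        for (int i = 0; i < header.length; i++) {
            if (header[i].equals(field)) return row[i];
        }
        return null; // no matching header column -> nothing deserialized
    }

    public static void main(String[] args) {
        String payload = "TollId,EntryTime,LicensePlate,State,Make,Model,VehicleType,VehicleWeight,Toll,Tag\n"
                       + "85,21/01/2015 10:24:56,QBQ 1188,OR,Toyota,4x4,1,0,4,361203677";
        System.out.println(fieldFromPayload(payload, "LicensePlate")); // prints QBQ 1188
    }
}
```

If the header line is absent or garbled, the column lookup finds nothing, which is consistent with the "After deserialization, 0 rows have been found" CsvParserError in the service log above.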
Note the bug in the 3rd message. In my opinion it's unrelated; it could be the WriteLine that can't keep up with the stream in the console application. At worst it's in the stream, but then I should at least see some lines in the output for the correctly formatted messages. -
Problems Reading SSL server socket data stream using readByte()
Hi, I'm trying to read an SSL server socket stream using readByte(). I need to use readByte() because my program acts as an LDAP proxy (it receives LDAP messages from an LDAP client, then passes them on to an actual LDAP server). It works fine with normal LDAP data streams, but once an SSL data stream is introduced, readByte() just hangs! Here is my code...
help!!! anyone?... anyone?
1. The SSL socket is first read into "InputStream input":
public void run()
{
    Authorization auth = new Authorization();
    try {
        InputStream input = client.getInputStream();
        while (true)
        {
            StandLdapCommand command;
            try
            {
                command = new StandLdapCommand(input);
                Authorization t = command.get_auth();
                if (t != null)
                    auth = t;
            }
            catch( SocketException e )
            {
                // If socket error, drop the connection
                Message.Info( "Client connection closed: " + e );
                close( e );
                break;
            }
            catch( EOFException e )
            {
                // If socket error, drop the connection
                Message.Info( "Client connection close: " + e );
                close( e );
                break;
            }
            catch( Exception e )
            {
                // Way too many of these to trace them!
                Message.Error( "Command not processed due to exception" );
                close( e );
                break;
                //continue;
            }
            processor.processBefore( auth, command );
            try
            {
                Thread.sleep(40); // yield to other threads
            }
            catch( InterruptedException ie ) {}
        }
    }
    catch (Exception e)
    {
        close(e);
    }
}
2. Then data is sent to an intermediate function, from this statement in the function above: command = new StandLdapCommand(input);

public StandLdapCommand(InputStream in) throws IOException
{
    message = LDAPMessage.receive(in);
    analyze();
}
3. Then finally, the read function, where it hangs at "int tag = (int)din.readByte();":
public static LDAPMessage receive(InputStream is) throws IOException
{
    /*
     * LDAP Message Format =
     * 1. LBER_SEQUENCE              -- 1 byte
     * 2. Length                     -- variable length = 3 + 4 + 5 ....
     * 3. ID                         -- variable length
     * 4. LDAP_REQ_msg               -- 1 byte
     * 5. Message specific structure -- variable length
     */
    DataInputStream din = new DataInputStream(is);
    int tag = (int)din.readByte(); // sequence tag
    ...
I suspect you are actually getting an exception, not tracing the cause properly, then doing a sleep, and then getting another exception. Never catch an exception without tracing what it actually is somewhere.
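To illustrate both points with a self-contained sketch (class and method names are illustrative, not from the proxy above): readByte() blocks until a byte arrives or throws EOFException when the peer closes the stream, and every catch traces the exception instead of swallowing it:

```java
import java.io.ByteArrayInputStream;
import java.io.DataInputStream;
import java.io.EOFException;
import java.io.IOException;

public class TagReader {
    // Read the BER sequence tag; readByte() blocks until a byte is available
    // and throws EOFException once the peer closes the stream.
    static int readTag(DataInputStream din) throws IOException {
        return din.readByte() & 0xFF; // mask to get the unsigned tag value
    }

    public static void main(String[] args) {
        byte[] ldapStart = { 0x30, 0x0C }; // LBER_SEQUENCE tag + a length byte
        try (DataInputStream din = new DataInputStream(new ByteArrayInputStream(ldapStart))) {
            System.out.println(readTag(din)); // prints 48 (0x30)
        } catch (EOFException e) {
            System.err.println("Peer closed the stream: " + e); // trace, don't swallow
        } catch (IOException e) {
            System.err.println("Read failed: " + e);            // trace, don't swallow
        }
    }
}
```

Because the read itself blocks, no sleep is needed anywhere in the loop; and because every catch logs the actual exception, a hang versus a silently swallowed error becomes easy to tell apart.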
Also, I don't know what the sleep is supposed to be for. You will block in readByte() until something comes in, and that should be enough yielding for anybody. The sleep is literally a waste of time. -
TSV_TNEW_PAGE_ALLOC_FAILED - BCS load from data stream task
Hi experts,
We got a short dump when executing the BCS "Load from Data Stream" task. The message is TSV_TNEW_PAGE_ALLOC_FAILED:
No storage space available for extending an internal table.
What happened? How can we solve this error?
Thanks
Marilia
Hi,
Most likely, the remedy for your problem is the same as in my answer to your other question:
Raise Exception when execute UCMON -
SEM-BCS: Data Stream Upload
Hi! All.
I am facing an issue in Data Stream upload. The target field 0COMPANY is 6 characters and the source field 0COMP_CODE is 4 characters in length. The system gives an error that the value of the target field exceeds the source field and to use an InfoObject with greater length. However, upon mapping Company to another 6-character InfoObject used instead of 0COMP_CODE, the system returns yet another error: that the data is not coming from the source system, or that the source system can't be determined for the new InfoObject.
I was thinking of changing the target field length to four. Is there any workaround for this issue? If the target field (0COMPANY) is changed, what implications will it have, or will just changing the field in the data model do the trick?
Thanks for your input....
Victor
Hello Victor,
If I understood your problem well, I faced the same thing in the past.
When customizing the load from data stream, set the length to 4 characters or try to include an offset of 2.
Hope is helps.
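Purely as an illustration of the offset idea (the real mapping is configured in BCS customizing, not in code, and the sample value is made up): reading 4 characters out of a 6-character field with an offset of 2 might look like this.

```java
public class OffsetDemo {
    // Simulates the suggested field mapping: take `length` characters
    // from `field` starting at `offset`.
    static String withOffset(String field, int offset, int length) {
        return field.substring(offset, offset + length);
    }

    public static void main(String[] args) {
        // Hypothetical 6-character company value padded with leading zeros.
        System.out.println(withOffset("001000", 2, 4));
    }
}
```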
If yes award points.
Best regards,
João Arvanas -
Problem accessing web service, error 2032 Stream Error
I'm building a small Flex app that uses two web services
provided by the same ColdFusion CFC.
I've verified that I can successfully access the web services
via HTTP, and one of the web services actually does work when
called. Both return structures that contain a string var, a numeric
var, and one or more query vars.
Here is my web service declaration:
quote:
<mx:WebService id="ws"
wsdl="https://www.it.dev.duke.edu/components/dukemagsearch/checkMailing2.cfc?wsdl"
useProxy="false">
<mx:operation name="queryDB" result="queryDBResult()"
fault="queryDBFault(event)">
<mx:request>
<RUNDATE>{cboRunDate.selectedItem.XDATE}</RUNDATE>
<ENTITYID>{txtEntityID.text}</ENTITYID>
<LASTNAME>{txtLastName.text}</LASTNAME>
<FIRSTNAME>{txtFirstName.text}</FIRSTNAME>
<MINITIAL>{txtMiddleInitial.text}</MINITIAL>
<PRFSCHCD>{cboPreferredSchool.selectedItem.TABLKEY}</PRFSCHCD>
<PRRECTYP>{cboRecordType.selectedItem.TABLKEY}</PRRECTYP>
<PRFCLASS>{txtPreferredClass.text}</PRFCLASS>
</mx:request>
</mx:operation>
<mx:operation name="getListData"
result="getListDataResult()" fault="getListDataFault()"/>
</mx:WebService>
Sorry if the formatting sucks, I don't know how to post
"code" here.
Anyway, the getListData() method works fine and populates my
list boxes. But when I call the queryDB() method, I get the
following fault event:
quote:
[FaultEvent fault=[RPC Fault faultString="HTTP request error"
faultCode="Server.Error.Request" faultDetail="Error: [IOErrorEvent
type="ioError" bubbles=false cancelable=false eventPhase=2
text="Error #2032: Stream Error. URL:
https://www.it.dev.duke.edu/components/dukemagsearch/checkMailing2.cfc"].
URL:
https://www.it.dev.duke.edu/components/dukemagsearch/checkMailing2.cfc"]
messageId="1E553B99-DF28-ED72-62B8-B84AA3919F9A" type="fault"
bubbles=false cancelable=true eventPhase=2]
I've looked all over the place and I can't seem to find what
this error means. I've tried all kinds of different ways of doing
things. Right now I'm browsing the flex app via a file URL (C:\...)
but I tried putting it up on the server too and that didn't work
either.
The method call *DOES* work when called via HTTP... ie
https://www.it.dev.duke.edu/components/dukemagsearch/checkMailing2.cfc?method=queryDB&RUNDATE=2006-05-19&FIRSTNAME=&LASTNAME=SMITH&MINITIAL=&PRFSCHCD=&PRRECTYP=AL&PRFCLASS=&ENTITYID=
(You have to be logged in for that to work so you'll just
have to trust me, it returns no records if you're not logged into
the web site already).
I'm totally stressing out about this because I've essentially
spent the entire day since 8am trying to solve this. The
application should've taken 15 minutes total.
HELP!
Thanks for any suggestions y'all have.
Rick

It means your output was not formed correctly and could not be parsed. Set up a server-side script or something to check that the output is indeed what you think it should be; 99% of the time I get this error it's due to malformed output from my web service or DB. Also try making an XML model of your target data to test the application internally. Look up "model" in the docs: it is easy to use, and if the model works then you know the data is faulty and you need to check your output and queries.
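As a hedged sketch of the check this reply suggests (the class name and sample payloads are invented for illustration), you could run the raw response body through an XML parser to confirm it is well-formed before pointing the Flex app at it. A ColdFusion error page returned with a 200 status is a common cause of error 2032:

```java
import java.io.StringReader;
import javax.xml.parsers.DocumentBuilderFactory;
import org.xml.sax.InputSource;

public class WellFormedCheck {
    // Returns true if the payload parses as well-formed XML, which is
    // what Flex needs before it can decode a SOAP response.
    static boolean isWellFormedXml(String payload) {
        try {
            DocumentBuilderFactory.newInstance()
                    .newDocumentBuilder()
                    .parse(new InputSource(new StringReader(payload)));
            return true;
        } catch (Exception e) {
            // Parser configuration, SAX, and I/O failures all mean
            // the payload is not usable XML.
            System.err.println("Malformed response: " + e.getMessage());
            return false;
        }
    }

    public static void main(String[] args) {
        System.out.println(isWellFormedXml("<result><row id=\"1\"/></result>"));
        System.out.println(isWellFormedXml("500 Internal Server Error<html>"));
    }
}
```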