HTTP XI - Data Buffering
Hi Everyone,
I am using the HTTP XI action block in a Business Logic transaction to send an XML document to the PI system, which in turn sends it to ECC.
I am trying to test the data buffering capability with the following steps:
1. Locked the PI user ID in the PI system that this action block uses to communicate with PI.
2. Then executed the transaction in MII; the HTTP XI action block returned the
Success property as "True" and the transaction processed successfully.
3. After some time, I unlocked the PI user ID in the PI system.
4. But I couldn't see the message in PI after I unlocked the user ID.
Please advise what needs to be done. Is there something I am missing here?
Thanks
Mahesh
All,
I configured the HTTP XI action block by setting the following parameter values in the configuration link for data buffering.
Property          Value
Day Retention     7
MaxRetryCount     50
MaxInterval       30000
Processing Type   Exactly Once in Order
Apart from the above, the general parameters for connecting to PI (server name, service, interface) are given.
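For reference, assuming MaxInterval is in milliseconds and the scheduler adds roughly one minute on top of it between retries (an assumption taken from the JRA data buffering help quoted later on this page), the configuration above implies a worst-case retry window that can be sketched as:

```java
public class RetryWindow {
    // One retry attempt waits MaxInterval plus the ~1 minute the scheduler adds
    // (assumption; see the JRA data buffering help quoted later on this page).
    static long perRetryMs(long maxIntervalMs, long schedulerExtraMs) {
        return maxIntervalMs + schedulerExtraMs;
    }

    // Worst-case time the buffered entry keeps retrying before it gives up.
    static long totalWindowMs(int maxRetryCount, long maxIntervalMs, long schedulerExtraMs) {
        return maxRetryCount * perRetryMs(maxIntervalMs, schedulerExtraMs);
    }

    public static void main(String[] args) {
        long perRetry = perRetryMs(30_000, 60_000);     // MaxInterval = 30000 ms
        long total = totalWindowMs(50, 30_000, 60_000); // MaxRetryCount = 50
        System.out.println("Per retry: " + perRetry / 1000 + " s");     // 90 s
        System.out.println("Retry window: " + total / 60_000 + " min"); // 75 min
    }
}
```

Under those assumptions the settings above keep retrying for roughly 75 minutes, which should comfortably cover a 30-minute outage, so messages disappearing after a minute would point at something other than retry exhaustion (for example, the action block reporting success instead of a communication error).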
To test data buffering, we brought down the PI system. What I observed:
1. While PI was down, we posted a couple of transactions in MII. I saw those transactions in the data buffering screen of MII
for a minute, after which they disappeared.
2. After 30 minutes, we brought the PI system back up, but I couldn't see any of the transaction data that went from MII to PI.
I also couldn't trace those messages in MII.
I am not sure what configuration I am missing to make this work.
I would appreciate any input.
Thanks
Mahesh
Similar Messages
-
Conditional indicators and data buffers
Hi,
When I was reading the help on VI memory usage (http://zone.ni.com/reference/en-XX/help/371361G-01/lvconcepts/vi_memory_usage/), I did not completely understand the part about conditional indicators and data buffers.
Could someone provide one or two examples of it?
Thanks,
Andrej
Solved!
Go to Solution.
Hi Jeff,
Sorry that I still couldn't understand the Conditional indicators and data buffers.
I tried checking the code you provided using "Show Buffer Allocations" in LV2012. I can see that both pieces of code allocate the same number of buffers; i.e., the one with the conditional indicator also created a buffer, as below (notice the black square dot on the Add function). If a conditional indicator creates additional buffers, as you explained in your previous post, why is an equal number of buffers created with non-conditional indicators as well? Can you please explain this?
Thanks,
Ajay.
Attachments:
Conditional Indiator.png 34 KB -
HTTP XML Data Source authentication error
Morning All,
I have just started playing around with using XML as a data source for my CR4E v2.0.1 reports.
I have managed to create the local inline reports and change their data source location with no current issues.
I am now trying to create the HTTPXML reports using code from Ted's BLOG.
I manage to create a new report and can open it in design; I can see the data source structure, but as soon as I try to browse the data or preview my report I get the following error:
Cannot Open file
Server returned HTTP response code: 407 for URL:
http://myserver/httpxml.xml
I had a quick look at the 407 error and it seems to be a proxy authentication error; we do have a proxy server set up which requires a username and password.
Before I go through what I have tried, I want to confess that for this httpxml.xml file all I did was take my inline.xml file, rename it, and make it available via a URL. So this might be the whole problem, but I am sure this is ok.
So I have tried turning off the proxy server, as this URL is local so I don't need to go through the proxy.
If I go to the URL in my browser it works fine and I do not need to enter any proxy details.
That's the problem with my HTTP XML attempt.
If I use the code as is, from Ted's BLOG, for
Crystal Reports off of HTTP XML data URL
and try to use the following 2 URLs
propertyBag.put("Http(s) XML URL", "http://resources.businessobjects.com/support/downloads/samples/cr/customer_db/orders.xml");
propertyBag.put("Http(s) Schema URL", "http://resources.businessobjects.com/support/downloads/samples/cr/customer_db/orders.xsd");
propertyBag.put("Http(s) XML URL", "http://resources.businessobjects.com/support/downloads/samples/cr/customer_db/customer.xml");
propertyBag.put("Http(s) Schema URL", "http://resources.businessobjects.com/support/downloads/samples/cr/customer_db/customer.xsd");
But I get the following errors
Exception in thread "main" com.crystaldecisions.sdk.occa.report.lib.ReportSDKLogonException: Logon
Error: {0}---- Error code:-2147217393 Error code name:dbLogonFailed
at com.businessobjects.reports.sdk.JRCCommunicationAdapter.a(Unknown Source)
at com.crystaldecisions.sdk.occa.report.application.DatabaseController.byte(Unknown
Source)
at com.crystaldecisions.sdk.occa.report.application.DatabaseController.addTable(Unknown
Source)
at com.businessobjects.samples.CreateHttpXML.main(CreateHttpXML.java:90)
Caused by: com.crystaldecisions.reports.common.LogonFailureException: Logon Error:{0}
at com.crystaldecisions.reports.queryengine.Connection.br(Unknown Source)
at com.crystaldecisions.reports.queryengine.Connection.bs(Unknown Source)
Any ideas as to what my issues are or what I am missing?
Thanks in advance.
Darren
Edited by: Darren Jackson on Apr 28, 2009 2:12 PM
Is there any documentation as to what entries I can use in the property bag?
Like is there a ("ProxySet", false) or something along those lines?
Edited by: Darren Jackson on Apr 28, 2009 5:28 PM
I have made a little more head way.
I saved the Orders.xml and Orders.xsd files onto my webserver and edited the code to only worry about the Orders files and ignore the Customer files.
I now create my reports and open them in Eclipse to design them, but when I try to preview the data it now asks me for a username and password. I have tried all the combinations I can think of for our systems, but none work.
Grrrrrrrrrrrrrrr
I have subsequently determined that my main problem was my "cached" proxy settings within Eclipse.
Even though I removed the proxy settings in Eclipse, it still required me to restart Eclipse after which it
all started working ok.
That was my main problem, but I am still interested in the Property Bag options that I mentioned before.
If proxy settings are needed, how would one go about setting these details?
Thanks
Darren -
Info on SAP JRA data buffering
Hello,
I'd need help on data buffering used with JRA remote function calls.
This is what is written in the documentation.
● DaysRetention
The number of days the system keeps the data buffer entry
● MaxRetryCount
The maximum number of times you can resubmit requests
● RetryInterval
The number of milliseconds the system waits before resubmitting the query or action request. The scheduler adds one minute to this time.
I'm wondering how to correctly populate these 3 values. It's not clear to me.
Is the data buffering activated only if the communication via RFC is unavailable?
If I'd like MII to repeat the RFC call a maximum of ten times, every 5 minutes, how can I configure the data buffering accordingly?
Mauro,
Data buffering is for errors in communication, as stated in the first sentence under the Use heading in the help. You are interested in the MaxRetryCount and RetryInterval parameters. I am not sure if your situation calls for changing the DaysRetention parameter; the default is 7 days.
So...
MaxRetryCount = 10
RetryInterval = 5 min (5 * 60 * 1000) = 300,000 ms
Or, if the Scheduler's extra minute throws you off, use 4 * 60 * 1000 = 240,000 ms
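The arithmetic above can be sketched as a tiny helper, deriving RetryInterval from the desired wall-clock cadence and optionally compensating for the one minute the scheduler adds (per the help text quoted earlier in this thread):

```java
public class RetryIntervalCalc {
    // RetryInterval (ms) for a desired wall-clock retry cadence; optionally
    // subtracts the ~1 minute the scheduler adds on top of the interval.
    static long retryIntervalMs(int desiredMinutes, boolean compensateSchedulerMinute) {
        long ms = desiredMinutes * 60L * 1000L;
        return compensateSchedulerMinute ? ms - 60_000L : ms;
    }

    public static void main(String[] args) {
        System.out.println(retryIntervalMs(5, false)); // 300000
        System.out.println(retryIntervalMs(5, true));  // 240000
    }
}
```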
Regards,
Kevin -
I have a VI that writes to a network shared variable using DataSocket. The DataSocket URL uses PSP. I have another VI that reads the network shared variable also using DataSocket. I am experimenting with data buffering to see when data is lost if the Writer VI writes faster than the Reader VI. Is data buffered using DataSocket with PSP in the URL? If not, I expect data will be lost. If it is buffered, I don't expect data to be lost until the buffer is full or overflows.
Attached is a project with the network shared variables and the Writer and Reader VI. VIs compare reading and writing directly using a shared variable node and using DataSocket. With DataSocket, I am experiencing data loss as if there is no buffering. When using the shared variable node, I do not see data loss. Run the Reader.vi. It will read two network shared variables every two seconds. One variable is read using DataSocket and one is read using a variable node. Next, run the Writer.vi. It will write to two network shared variables every 0.5 seconds. One variable is written using DataSocket and one is written using a variable node. Since the Writer VI is writing four times as fast as the Reader VI data will need to be buffered to avoid data loss. Monitor the Buffered Sequence and BufferedDS Sequence front panel indicators in the Reader VI. Buffered Sequence is data from the variable node. BufferedDS Sequence is data from the DataSocket Read.
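The rate mismatch described above is language-independent; a minimal sketch (in Java, purely illustrative, not LabVIEW/DataSocket code) of why a single-slot, unbuffered variable loses three out of every four writes when the writer runs four times faster than the reader, while a FIFO buffer loses none:

```java
import java.util.ArrayDeque;
import java.util.Deque;

public class RateMismatch {
    // Unbuffered: a single slot the writer overwrites (like reading only the
    // latest value of a shared variable); the reader observes one value per tick.
    static int unbufferedObserved(int writesPerRead, int readerTicks) {
        int latest = -1;
        int observed = 0;
        int seq = 0;
        for (int t = 0; t < readerTicks; t++) {
            for (int w = 0; w < writesPerRead; w++) {
                latest = seq++;         // earlier values are overwritten and lost
            }
            observed++;                 // reader sees only `latest`
        }
        return observed;
    }

    // Buffered: a FIFO queue between writer and reader; nothing is lost as long
    // as the buffer does not overflow and the reader drains it.
    static int bufferedObserved(int writesPerRead, int readerTicks) {
        Deque<Integer> fifo = new ArrayDeque<>();
        int seq = 0;
        int observed = 0;
        for (int t = 0; t < readerTicks; t++) {
            for (int w = 0; w < writesPerRead; w++) {
                fifo.addLast(seq++);
            }
            while (!fifo.isEmpty()) {   // reader catches up on buffered values
                fifo.removeFirst();
                observed++;
            }
        }
        return observed;
    }

    public static void main(String[] args) {
        // writer every 0.5 s, reader every 2 s => 4 writes per read, as in the post
        System.out.println("unbuffered: " + unbufferedObserved(4, 10) + " of 40");
        System.out.println("buffered:   " + bufferedObserved(4, 10) + " of 40");
    }
}
```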
Solved!
Go to Solution.
Attachments:
Net Share Var & DataSocket.zip 49 KB
Does PSP in the DataSocket URL change the data buffering? Attached is a page from 'LabVIEW 8.5.1 help/fundamentals/networking in LabVIEW/concepts/choosing among LabVIEW communication features' mentioning lossless data transmission for DataSocket with the psp protocol (2nd row in the table). Does lossless data indicate one packet is guaranteed to be sent from the writer and received by the reader, or does it provide the guarantee with additional packets buffered?
Attachments:
LabVIEW Communication Features.pdf 61 KB -
Tracking objects in oracle data buffers
hi all,
I am trying to move data blocks from the cold region of the "default pool" to the hot region of the "keep pool" and would like to know if there's a sweet and simple way to find out and achieve this.
comments and inputs would be appreciated.
thanks.
User1082,
I'm not sure if I understand you correctly, but would like you to note that the KEEP pool is a separate RAM region (db_keep_cache_size) in addition to the db_cache_size parameter, which creates the DEFAULT pool.
Determining the actual size for data buffers can be critical depending on the size of the database, and with the KEEP pool there to keep the database buffer hit ratio at 100%, it becomes a bit difficult to move small tables and indexes into the KEEP pool. Hence, to achieve this, you should (a) be aware of the most frequently accessed tables in your DB, and (b) know of tables with high buffer residency.
One recommended approach (by Donald Burleson) to assigning KEEP pool contents is to assign objects with 80% or more of their data blocks in the buffer to the KEEP pool. This could be executed either manually or via dbms_job, on a rotating basis. You can repeat the same for assigning other objects (with 80% or more data blocks in physical RAM) to the KEEP pool.
For further understanding, please refer to Oracle Performance Tuning Guide provided by Aman.
Hope this helps.
Regards,
Naveed. -
I have virtualized a bare-metal DB machine. The VM host is ESXi 5.5. The disk subsystem, as it pertains to SQL, on VMware is much faster than bare metal. However, a cached operation is taking much longer. For instance, I have a query
which joins 2 tables and returns no rows. In both cases, the tables should be in the data buffers. Nevertheless, bare metal takes 8 seconds to complete and virtualized takes 16 seconds to complete. Why?
In both cases, I flushed the buffers, ran the query twice and took the last result. In both cases disk activity was negligible.
Thanks in advanceAre these tables are properly updated for indexes and update stats? Please check this and confirm.
Also what about execution plan as asked above?
Please share the sql server version and check if both have same versions in terms of latest cu and hotfixes apart from same versions and SP.
Santosh Singh -
Single admin server and two managed servers in a cluster. Using NES plugin API from three web servers to the cluster. EJB is deployed with PersistenceStore=Memory.
Session data is not replicating. Cluster is configured for Round-robin. One jsp hits server (A) the next hits server (B). Well,
server (B) requires the session data from the first jsp.
Anyone have any idea where I should look next to troubleshoot this.
TIA
Hi.
You need to make sure you have PersistentStoreType set in your weblogic.xml as follows in order to replicate http session data:
<session-descriptor>
<session-param>
<param-name> PersistentStoreType </param-name>
<param-value> replicated </param-value>
</session-param>
</session-descriptor>
HTH,
Michael
--
Michael Young
Developer Relations Engineer
BEA Support -
Hi,
I am really struggling with HTTP data submission from a LiveCycle form.
There seems to be loads of code samples for ColdFusion, but nothing for ASP/ASP.net - all I am looking to do is submit the data, post a response back to the user confirming it has been received and save the posted data to a DB.
I have used the standard request command, but when I post the form I am just getting a general error.
Can anyone help with some code snippets that will get me going on this?
Many thanks in advance.
Gareth
We have a separate forum for Designer. Please repost in the LiveCycle Designer forum.
-
HTTP post data from the Oracle database to another web server
Hi ,
I have searched the forum and the net on this. And yes I have followed the links
http://awads.net/wp/2005/11/30/http-post-from-inside-oracle/
http://manib.wordpress.com/2007/12/03/utl_http/
and Eddie Awad's Blog on the same topic. I was successful in calling the servlet but I keep getting errors.
I am using Oracle 10 g and My servlet is part of a ADF BC JSF application.
My requirement is that I have a blob table in another DB, and our Oracle Forms application, based on another DB, has to view the documents. Viewing blobs over dblinks is not possible. So option 1 is to call a procedure passing the doc_blob_id parameter and call the web server passing the parameters.
The errors I am getting are:
1. The parameters passed returned null; and
2. Since my servlet directly downloads the document on the response outputStream, it gives this error:
'com.evermind.server.http.HttpIOException: An established connection was aborted by the software in your host machine'
Any help please. I am running out of time.
Thanks
user10264958 wrote:
My requirement is that I have blob table in another DB and our Oracle Forms application based on another DB has to view the documents. Viewing blobs over dblinks is not possible.
Incorrect. You can use remote LOBs via a database link. However, you cannot use a local LOB variable (called a LOB locator) to reference a remote LOB. A LOB variable/locator is a pointer, and that pointer cannot reference a LOB that resides on a remote server. So simply do not use a LOB variable locally, as it cannot reference a remote LOB.
Instead provide a remote interface that can deal with that LOB remotely, dereference that pointer on the remote system, and pass the actual contents being pointed at, to the local database.
The following demonstrates the basic approach. How one designs and implements the actual remote interface, need to be decided taking existing requirements into consideration. I simply used a very basic wrapper function.
SQL> --// we create a database link to our own database as it is easier for demonstration purposes
SQL> create database link remote_db connect to scott identified by tiger using
2 '(DESCRIPTION=(ADDRESS=(PROTOCOL=TCP)(HOST=127.0.0.1)(PORT=1521))(CONNECT_DATA=(SID=dev)(SERVER=dedicated)))';
Database link created.
SQL> --// we create a table with a CLOB that we will access via this db link
SQL> create table xml_files( file_id number, xml_file clob );
Table created.
SQL> insert into xml_files values( 1, '<root><text>What do you want, universe?</text></root>' );
1 row created.
SQL> commit;
Commit complete.
SQL> --// a local select against the table works fine
SQL> select x.*, length(xml_file) as "SIZE" from xml_files x;
FILE_ID XML_FILE SIZE
1 <root><text>What do you want, universe?</text></root> 53
SQL> --// a remote select against the table fails as we cannot use remote pointers/locators
SQL> select * from xml_files@remote_db x;
ERROR:
ORA-22992: cannot use LOB locators selected from remote tables
no rows selected
SQL> --// we create an interface on the remote db to deal with the pointer for us
SQL> create or replace function ReturnXMLFile( fileID number, offset integer, amount integer ) return varchar2 is
2 buffer varchar2(32767);
3 begin
4 select
5 DBMS_LOB.SubStr( x.xml_file, amount, offset )
6 into
7 buffer
8 from xml_files x
9 where x.file_id = fileID;
10
11 return( buffer );
12 end;
13 /
Function created.
SQL> --// we now can access the contents of the remote LOB (only in 4000 char chunks using this example)
SQL> select
2 file_id,
3 ReturnXMLFile@remote_db( x.file_id, 1, 4000 ) as "Chunk_1"
4 from xml_files@remote_db x;
FILE_ID Chunk_1
1 <root><text>What do you want, universe?</text></root>
SQL> --// we can also copy the entire remote LOB across into a local LOB and use the local one
SQL> declare
2 c clob;
3 pos integer;
4 iterations integer;
5 buf varchar2(20); --// small buffer for demonstration purposes only
6 begin
7 DBMS_LOB.CreateTemporary( c, true );
8
9 pos := 1;
10 iterations := 1;
11 loop
12 buf := ReturnXMLFile@remote_db( 1, pos, 20 );
13 exit when buf is null;
14 pos := pos + length(buf);
15 iterations := iterations + 1;
16 DBMS_LOB.WriteAppend( c, length(buf), buf );
17 end loop;
18
19 DBMS_OUTPUT.put_line( 'Copied '||length(c)||' byte(s) from remote LOB' );
20 DBMS_OUTPUT.put_line( 'Read Iterations: '||iterations );
21 DBMS_OUTPUT.put_line( 'LOB contents (1-4000):'|| DBMS_LOB.SubStr(c,4000,1) );
22
23 DBMS_LOB.FreeTemporary( c );
24 end;
25 /
Copied 53 byte(s) from remote LOB
Read Iterations: 4
LOB contents (1-4000):<root><text>What do you want, universe?</text></root>
PL/SQL procedure successfully completed.
SQL>
The concern is the size of the LOB. It does not always make sense to access the entire LOB in the database. What if that LOB is 100GB in size? Irrespective of how you do it, selecting that LOB column from that table will require 100GB of data to be transferred from the database to your client.
So you need to decide WHY you want the LOB on the client (which will be the local PL/SQL code in case of dealing with a LOB on a remote database)? Do you need the entire LOB? Do you need a specific piece from it? Do you need the database to first parse that LOB into a more structured data struct and then pass specific information from that struct to you? Etc.
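The chunked-copy loop in the PL/SQL block above is a generic pattern; a minimal Java analogue (illustrative only, not database code) that pulls a "remote" string across in fixed-size chunks, mirroring the 1-based offset/amount semantics of DBMS_LOB.SubStr:

```java
public class ChunkedCopy {
    // Stand-in for ReturnXMLFile@remote_db: returns up to `amount` chars
    // starting at 1-based `offset`, or null past the end (like DBMS_LOB.SubStr
    // returning null when the offset is beyond the LOB).
    static String chunk(String remote, int offset, int amount) {
        if (offset > remote.length()) return null;
        int end = Math.min(offset - 1 + amount, remote.length());
        return remote.substring(offset - 1, end);
    }

    // Copy the whole "remote" value across in fixed-size chunks.
    static String copyAll(String remote, int chunkSize) {
        StringBuilder local = new StringBuilder();
        int pos = 1;
        String buf;
        while ((buf = chunk(remote, pos, chunkSize)) != null) {
            local.append(buf);
            pos += buf.length();
        }
        return local.toString();
    }

    public static void main(String[] args) {
        String doc = "<root><text>What do you want, universe?</text></root>";
        String copy = copyAll(doc, 20); // 20-char chunks, as in the PL/SQL demo
        System.out.println("Copied " + copy.length() + " char(s), equal: " + copy.equals(doc));
    }
}
```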
The bottom line however is that you can use remote LOBs. Simply that you cannot use a local pointer variable to point and dereference a remote LOB. -
PR05 Credit card data Buffering issue
Hello,
I have an issue with Credit card data. I guess problem is with document buffering.
Here is the information..
I am using Tcode PR05 (Travel Expense Manager) for US travel & expenses. TRVPA CCC is set to 4.
I assigned credit card expense data to a trip; when I double-click on the credit expense type for credit card data, I can see the credit card information. Then I replaced it with a non-credit expense type and checked for credit card data by double-clicking on the item. There is no credit card data, which is correct. Later I replaced it with the old credit expense type, but when I tried to see the credit card data, there is no credit card tab or information. I have debugged the entire code, but not found the solution.
Please help me.
Thanks
hi,
This works as designed. Once you replace the credit card receipt with a non-credit-card receipt, the first thing that happens is that the credit card information is deleted. Now if you replace this non-credit-card receipt with the old credit card receipt, the information will stay the same, without the credit card information, unless this credit card came from the buffer.
Regards,
Raynard -
Hi Guys,
I have configured HTTP - RFC and I tested with an HTTP client tool and got the success return code. I checked in RWB and sxmb_moni and everything is OK.
I have seen the response XML sent to the third-party system, and there are some fields where the data is not populated.
What could be the reason, and do I need to look at the RFC parameters on the SAP side in that specific function module?
Help would be appreciated.
Thanks,
srini
Hi Sreenivas,
If the RFC is not returning the values, then what you do is:
Map the node of the target which has 0 to Unbounded occurrence to one of the mandatory parameters of the source (one which will definitely occur). This way, you will at least get empty fields but the complete structure you require in the output. Just check this one and let me know.
Thanks,
Adithya K
SAP Practise,
[email protected]
Note: Don't forget to reward points if it is useful. -
How to HTTP POST data to SAP Business Connector
Hello,
I would like to transfer data from a client with HTTP POST to SAP Business Connector. SAP BC acts as server. In SAP BC I created a Java service containing the code:
IDataCursor idatacursor = pipeline.getCursor();
idatacursor.first("node");
Object obj1 = idatacursor.getValue();
System.out.println(obj1.toString()); //for test
But how can I access the data that was sent with HTTP POST in my service?
Thank you
Piotr Dudzik
Hi,
quite easy:
StringBuffer buffer = new StringBuffer();
String resultString = null;
String xmlString = "<?xml version=\"1.0\" encoding=\"UTF-8\"?>"+
"<biztalk_1 xmlns=\"urn:biztalk-org:biztalk:biztalk_1\">"+
"<header>"+
"<delivery>"+
"<to>"+
"<address>urn:sap-com:logical-system:XXX</address>"+
"</to>"+
"<from>"+
"<address>urn:sap-com:logical-system:YYY</address>"+
"</from>"+
"</delivery>"+
"</header>"+
"<body>"+
"<doc:Z_RFC_CALL_NAME xmlns:doc=\"urn:sap-com:document:sap:rfc:functions\" xmlns=\"\">"+
... [PARAMETERS]
"</doc:Z_RFC_CALL_NAME>"+
"</body>"+
"</biztalk_1>";
try {
URL url = new URL(SCHEMA, this.host, Integer.parseInt(PORT), FILE);
HttpURLConnection connection = (HttpURLConnection) url.openConnection();
initConnection(connection); // SCHEMA, PORT, FILE, initConnection() and getXMLEntry() are defined elsewhere in the poster's class
OutputStream out = connection.getOutputStream();
out.write(xmlString.getBytes());
out.close();
InputStream reader = connection.getInputStream();
int ch;
while ((ch = reader.read()) != -1) // read() returns -1 at end of stream
buffer.append((char) ch);
reader.close();
resultString = buffer.toString();
if (this.getXMLEntry(resultString, "E_STATUS").equals("E")) { // ERROR
System.out.println("errormessage: " + this.getXMLEntry(resultString, "E_EMSG"));
} else {
// ok, I suppose this is an S (success), parse the stuff
}
} catch (Exception e) {
e.printStackTrace();
System.out.println(e);
}
-
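The getXMLEntry() helper used in the SAP BC reply above is not shown there; a self-contained sketch of such a helper (hypothetical, a naive tag extractor rather than a real XML parser) could be:

```java
public class XmlEntry {
    // Naive extraction of the text between <tag>...</tag>.
    // Good enough for flat RFC-style responses; a real implementation
    // should use a proper XML parser (e.g. javax.xml.parsers).
    static String getXMLEntry(String xml, String tag) {
        String open = "<" + tag + ">";
        String close = "</" + tag + ">";
        int start = xml.indexOf(open);
        if (start < 0) return null;
        start += open.length();
        int end = xml.indexOf(close, start);
        if (end < 0) return null;
        return xml.substring(start, end);
    }

    public static void main(String[] args) {
        String response = "<rsp><E_STATUS>E</E_STATUS><E_EMSG>user locked</E_EMSG></rsp>";
        System.out.println(getXMLEntry(response, "E_STATUS")); // E
        System.out.println(getXMLEntry(response, "E_EMSG"));   // user locked
    }
}
```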
It seems that the Date header is not allowed for HTTPService.
I am trying to set the date header for my request but this throws
an error. When I don't try to set the header, it gets set
automatically. The problem is, I need to know the exact value of
that header before I send the request (I am using it to encrypt
some values)
Any thoughts?
This document lists the headers that are blocked:
http://kb.adobe.com/selfservice/viewContent.do?externalId=kb403030
I think you can add the ability to pass certain headers by
editing the target server's cross domain policy file. This is
described here:
http://kb.adobe.com/selfservice/viewContent.do?externalId=kb403185&sliceId=2
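For reference, the second article describes permitting extra request headers via the target server's crossdomain policy; a sketch might look like the following (the domain and header names are hypothetical, and the hard-blocked headers from the first article, which may well include Date, likely cannot be re-enabled this way):

```xml
<?xml version="1.0"?>
<cross-domain-policy>
  <!-- hypothetical: allow the listed custom headers from any domain -->
  <allow-http-request-headers-from domain="*" headers="X-Request-Signature"/>
</cross-domain-policy>
```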
Does that help?
matt horn
flex docs -
Clearing data buffered by smartforms
Hi,
Is there a way for me to clear data that is already buffered by standard SMARTFORMS program?
Thanks.
Hi,
Please be more clear in your question. Do you mean that you want to clear all the data that has been passed to the form interface?
Regards,
Ram.