Info on SAP JRA data buffering
Hello,
I need help with data buffering used with JRA remote function calls.
This is what is written in the documentation.
● DaysRetention
The number of days the system keeps the data buffer entry
● MaxRetryCount
The maximum number of times you can resubmit requests
● RetryInterval
The number of milliseconds the system waits before resubmitting the query or action request. The scheduler adds one minute to this time.
I'm wondering how to correctly populate these 3 values. It's not clear to me.
Is the data buffering activated only if the communication via RFC is unavailable?
If I'd like MII to repeat the RFC call for maximum ten times every 5 minutes, how can I configure the data buffering accordingly?
Mauro,
Data buffering is for errors in communication, as stated in the first sentence under the Use heading in the help. You are interested in the MaxRetryCount and RetryInterval parameters. I am not sure whether your situation calls for changing the DaysRetention parameter; the default is 7 days.
So...
MaxRetryCount = 10
RetryInterval = 5 min (5 * 60 * 1000) = 300,000 ms
Or, if the scheduler's extra minute throws you off, use 4 * 60 * 1000 = 240,000 ms, so that each retry lands 5 minutes apart.
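As a sanity check, the resulting retry schedule can be computed from these two values. A minimal Java sketch (the one-minute scheduler overhead is taken from the help text quoted above; the class and method names are illustrative, not MII APIs):

```java
public class RetrySchedule {
    // Values from the data buffering configuration discussed above.
    static final int MAX_RETRY_COUNT = 10;
    static final long RETRY_INTERVAL_MS = 4 * 60 * 1000;   // 240,000 ms
    static final long SCHEDULER_OVERHEAD_MS = 60 * 1000;   // extra minute the scheduler adds

    // Milliseconds from the initial failure until retry attempt n (1-based).
    static long millisUntilAttempt(int n) {
        return n * (RETRY_INTERVAL_MS + SCHEDULER_OVERHEAD_MS);
    }

    public static void main(String[] args) {
        for (int n = 1; n <= MAX_RETRY_COUNT; n++) {
            System.out.println("Retry " + n + " after " + millisUntilAttempt(n) / 60000 + " min");
        }
    }
}
```

With RetryInterval = 240,000 ms, the tenth and final retry fires roughly 50 minutes after the initial failure; with 300,000 ms it would be 60 minutes.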
Regards,
Kevin
Similar Messages
-
Need Info on RDA-enabled data source based on FM
Hi,
I need Info on RDA-enabled data source based on Function Module.
How to implement it?
Thanks & Regards,
Rashmi.
Hi Rashmi,
Check this link
http://help.sap.com/saphelp_nw70/helpdata/EN/52/777e403566c65de10000000a155106/frameset.htm
[under tab Transferring Transaction Data from Source Systems (RDA)]
http://help.sap.com/saphelp_nw70/helpdata/EN/3f/548c9ec754ee4d90188a4f108e0121/frameset.htm
Regards
Jagadish -
Transitions - Core Changes in SAP Master Data Management
Hi,
I have been hearing about SAP Master Data Management's transition to the latest shipped version, 7.1.
Could anyone list/share the core changes it will have with reference to the current version?
regards,
bharatkumar
Hi Bharatkumar and all readers,
Recently I have been going through a webinar available on SDN (on the home page of Master Data Management) which, I am sure, would answer your query.
Please visit - https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/d0d0b0e7-f144-2b10-17a2-982103c96ac0
In case you have any trouble locating the link - visit SDN > Information Management > MDM: the link to the webinar is right there on the home page.
I am sure this would help all readers seeking info about SAP MDM 7.1.
Regards,
Krutarth
Edited by: Krutarth Amish Vasavada on Aug 6, 2008 12:54 PM -
SAP Accelerated Data Migration
Hi All,
Could someone kindly provide more info about SAP ADM? I am unable to get any info about it. I would like to understand how exactly the tool works. I have the 4-page PDF that is posted on the site, but it is not clear on the actual tool. Can someone kindly provide me with a document, screen shots, or any more info about it? My mail id is [email protected] Could someone kindly reply at the earliest?
Thanks
Prahlad
Hi Prahlad,
Go through this; I hope it helps you understand.
With SAP Accelerated Data Migration, you can reduce migration costs by as much as 50% and avoid interruption of business processes. Moreover, shutting down the source system after migration reduces system administration costs and total cost of operations. In short, you realize the following benefits:
● Significantly reduced cost and time to complete migration projects
● Accurate, cost-effective data transfer applicable for any kind of source system
● Better data quality because of preconfigured business objects that ensure data consistency
● Improved end-user productivity and acceptance thanks to migration of historical data
● Effective migration that avoids interruption of business processes
● Full support services to avoid risks and ensure the optimum performance of your new business applications
● Faster return on investment
In short, a smoother, more cost-effective migration to a new technology solution ultimately positions your organization to lower your total cost of ownership, maintain competitive advantage, and pursue new business opportunities.
Expertise in Action
SAP Accelerated Data Migration applies a business object-oriented, two-step approach that uses a neutral interface as a staging area and predefined migration content for the conversion and upload of data. The neutral interface enables the SAP tool to generate predefined migration content and prevents all potential legal issues regarding the intellectual property of any source-system vendor. The whole data migration process from the source to the target system consists of just two steps, as follows:
1. Data is extracted from the source system into the standard interface as XML files.
2. Data migrates from the interface into the mySAP Business Suite database. The migration is based on a new migration workbench engine developed by SAP on the SAP NetWeaver® platform. All requirements for mapping structures and fields and developing complex conversion rules are solved within this engine (see Figure 1).
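Step 1, extraction into a neutral XML staging file, can be illustrated with a short Java sketch. The element and attribute names below are invented for illustration only; ADM's actual interface schemas are defined by its predefined migration content.

```java
import javax.xml.stream.XMLOutputFactory;
import javax.xml.stream.XMLStreamWriter;
import java.io.StringWriter;

public class StagingExport {
    // Serialize source-system records as a neutral XML document
    // (step 1 of the two-step approach described above).
    static String toStagingXml(String[][] customers) throws Exception {
        StringWriter out = new StringWriter();
        XMLStreamWriter xml = XMLOutputFactory.newInstance().createXMLStreamWriter(out);
        xml.writeStartDocument("UTF-8", "1.0");
        xml.writeStartElement("Customers");          // invented root element
        for (String[] c : customers) {
            xml.writeStartElement("Customer");       // one invented record element per row
            xml.writeAttribute("id", c[0]);
            xml.writeStartElement("Name");
            xml.writeCharacters(c[1]);
            xml.writeEndElement();
            xml.writeEndElement();
        }
        xml.writeEndElement();
        xml.writeEndDocument();
        xml.close();
        return out.toString();
    }

    public static void main(String[] args) throws Exception {
        System.out.println(toStagingXml(new String[][] {{"1001", "ACME Corp"}}));
    }
}
```

In step 2, the migration workbench engine would read such files, apply the mapping and conversion rules, and insert the results into the target database.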
Once the migration is complete, business-unit end users have access to all the legacy data in the new applications as if it had originated there. They can continue to work on the existing business process items in the new applications and benefit from improved functionality.
Lifting the Limitations
Much of the cost and effort involved in classical data migrations is generated by migration content development, as follows:
● Identifying business objects for migration to properly support the business
● Defining the structure and field mapping for the relevant business objects
● Developing conversion rules for all necessary value mapping
Readily available migration content can simplify this effort.
SAP Accelerated Data Migration provides preconfigured business content, helping you migrate to your new system more efficiently and rapidly. The tool allows the migration of all types of data, independent of its current state within a business process. This includes master and dynamic data, as well as partially processed and historical data, to minimize data loss. Business processes are uninterrupted, and normal operating procedures can be retained.
By providing a standard, neutral interface and reading data as an XML file, SAP Accelerated Data Migration is applicable for any kind of source system. Preconfigured data migration objects built specifically for SAP applications significantly simplify the conversion of non-SAP software data into SAP software data objects, yielding far-reaching benefits. Besides reducing related IT costs, you can be certain of consistency across business-object boundaries. Through a direct insert into the database, you avoid the performance limitations of classical data migration.
Reward points if helpful.
Thanks -
SAP BO Data Services XI 3.2 - Cannot Handle Multithreaded RFC Connection?
Hi Guys,
Just want to ask for your input: can Data Services not handle multiple RFC connection requests to a BW system?
The scenario is:
One BODI job uses an RFC connection and triggers a second job at the same time, and the second job fails.
The current version of SAP BO Data Services XI 3.2 that we are using is 12.2.2.1.
Thanks in advance,
Randell
Arpan,
One way to get to the multiprovider data is to use Open Hub with a DTP that gets the data from the multiprovider and exposes it as an open hub destination to Data Services. With Data Services XI 3.2 we now fully support Open Hub where Data Services will (1) start the process chain to load the data (2) read the data when process chain ended and (3) notify Open Hub when done so that the data can be purged again.
More info on Open Hub here : http://help.sap.com/saphelp_nw04/helpdata/en/1e/c4463c6796e61ce10000000a114084/content.htm
But I will also look into why we show the multiproviders when browsing the metadata but get an error when trying to extract using the ABAP method (not via Open Hub). You could be right in your assumption below, and we might just need to hide the multiproviders when browsing metadata.
Thanks,
Ben.
Edited by: Ben Hofmans on Jan 5, 2010 6:06 PM - added link to Open Hub documentation which references multiproviders as possible source. -
Determining File Name in Info Package under External Data
I am on SAP BW 3.0. A system is sending a flat file every few days with a date-time stamp, e.g., d:\loaddar\file_20080212_122300.csv.
I know that in an InfoPackage one can create a routine under External Data to determine the file name. I have seen examples where people determine the file name based on the date. Since my file has a time stamp, what code do I write to pick the file? Is there a way to read one or more files and determine the file name?
I am new to SAP BW and ABAP. However, I have a lot of experience with Oracle and Java.
Can someone point me to how this can be done? I am looking for some sample code as well.
Thanks in advance. I will really appreciate your help.
Hello Prem,
I also used to get the file suffixed with date & time, and I found it very difficult to pick up the file from the application server using a routine in the InfoPackage. I then asked for the suffix to be changed to date only, and it became easy to pick the file using a routine. But I think in your case the files are coming more than once, so you should write a small Unix script to combine these files into a single file named with the date only, and then execute the InfoPackage to load it.
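If renaming is not an option, the "pick the newest file" logic can also be done by sorting on the timestamp embedded in the name. A rough illustration of the idea in Java rather than ABAP (in BW the InfoPackage routine itself would be ABAP; the file-name pattern is the one from the question):

```java
import java.util.Arrays;
import java.util.Comparator;
import java.util.regex.Pattern;

public class LatestFile {
    // Matches file_YYYYMMDD_HHMMSS.csv, as in the question.
    static final Pattern NAME = Pattern.compile("file_(\\d{8})_(\\d{6})\\.csv");

    // Return the lexicographically greatest matching name; because the
    // timestamp is zero-padded YYYYMMDD_HHMMSS, that is also the newest file.
    static String newest(String[] names) {
        return Arrays.stream(names)
                .filter(n -> NAME.matcher(n).matches())
                .max(Comparator.naturalOrder())
                .orElse(null);
    }

    public static void main(String[] args) {
        String[] names = {"file_20080212_122300.csv", "file_20080215_090000.csv", "readme.txt"};
        System.out.println(newest(names));
    }
}
```

The same comparison trick works in an ABAP routine: list the directory, keep names matching the pattern, and take the maximum string.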
Cheers!
Sanjiv -
Can Info Object be a Data Target
Hi Gurus
Can an InfoObject be a data target? I would appreciate an explanation of this.
GSR
Hi S R G,
You can indicate an InfoObject of type characteristic as a data target / InfoProvider if it has attributes and/or texts. The data is then loaded into the master data tables using the update rules.
When we check the option "characteristic as InfoProvider", it creates three tables; supposing the InfoObject name is TTT, then:
1. /BI0/MTTT - Master Data Table
2. /BI0/PTTT - Master Data Table (Time-Independent)
3. /BI0/XTTT - SID Table
Data is stored in the master data table, and when we delete the request the data won't be deleted; for that we have to drop the master data table in transaction SE14.
http://help.sap.com/saphelp_nw04/helpdata/en/c4/b007720ae4c248b945bb16f24bba31/frameset.htm
Thanks -
RFC Assign Link As XML Multiple Input Error SAP JRA Function Call
I am using SAP MII 12.1.0 (Build 201)
I have a problem with the SAP JRA Function Call action when using Assign XML in a link assignment.
I need to assign multiple inputs from a local XML property to my RFC table as Assign XML,
but when I execute my transaction I get the following errors:
[WARN] Data buffer filter for this action does not exist or the interface is incorrect.
[ERROR] Unable to make RFC call Exception: [com.sap.conn.jco.JCoException: (104) RFC_ERROR_SYSTEM_FAILURE: Transferred IDoc data records are empty (internal error) ]
[WARN] [SAP_JRA_Function_Upload_Material_Consumption] Skipping execution of output links due to action failure.
[ERROR] [UserEvent] : com.sap.conn.jco.JCoException: (104) RFC_ERROR_SYSTEM_FAILURE: Transferred IDoc data records are empty (internal error)
Please help me out with this issue.
Regards,
Manoj Bilthare
Dear Jeremy,
Thanks for the reply.
My problem is solved; it was due to a version problem. The MII version is now 12.1.4 Build (46).
Regards,
Manoj Bilthare -
Info-object Maintenance -- Master Data/Texts Tab options
Hello BI Gurus,
In BI 7.o:
Path:
Info-object Maintenance --> Master Data/Texts Tab --> Master Data InfoSource/Data Target/InfoProvider/Master Data Read Access.
Scenario 1:
1. Master Data Access - Own Implementation
Name of Master Data Access -
CL_RSMD_RS_SPEC class for accessing generic Bw special info objects
CL_RSR_XLS_TABF4 Table Master Data Read Class
CL_RSROA_LOCAL_MASTERDATA Master Data for Virtual Characteristics
CL_RSMD_RS_BW_SPEC Base class for accessing Bw special info objects
CL_RSMD_RS_SPEC_TXT Class for Accessing Generic BW-Specific InfoObjects
Under what scenario do we go for this option?
Has anybody used this option?
Scenario 2:
1. Master Data Access - Direct Access
Name of Master Data Access - CL_RSR_REMOTE_MASTERDATA
Under what scenario do we go for this option?
Has anybody used this option?
Hi,
With Master Data:
If you set this indicator, the characteristic may have attributes. In this case the system generates a P table for this characteristic. This table contains the key of the characteristic and any attributes that might exist. It is used as a check table for the SID table. When you load transaction data, there is a check of whether a characteristic value exists in the P table if referential integrity is used.
With Maintain Master Data you can go from the main menu to the maintenance dialog for processing attributes.
The master data table can have a time-dependent and a time-independent part.
In attribute maintenance, determine whether an attribute is time-dependent or time independent.
With Texts:
Here, you determine whether the characteristic has texts.
If you want to use texts with a characteristic, you have to select at least one text. The short text (20 characters) option is set by default but you can also choose medium-length texts (40 characters) or long texts (60 characters).
Helpful link:
http://help.sap.com/saphelp_nw2004s/helpdata/en/71/f470375fbf307ee10000009b38f8cf/frameset.htm
Regards,
Suman -
SAP standard DataSources for the 0SD_C03 Sales Overview cube
Hi All,
Can anybody give info on which SAP SD DataSources are connected to the 0SD_C03 InfoCube?
Regards
prasad
Hi,
Check the SAP Help site.
Which DataSources you use depends on your business.
rahuleaswar.pbworks.com/f/DesignSpec-+0SD_C03.doc
www.sembps.com/documentation/BW/SD/8_sales_analysis.doc
http://help.sap.com/saphelp_nw70/helpdata/en/90/d4f33b949b6b31e10000000a11402f/frameset.htm
Thanks
Reddy -
Hi there,
I am trying to call BAPI methods on an ECC server from inside a Java EE server (JBoss AS). I can use JCo to accomplish this, and I am interested in using connection pools. I know that JCo has built-in mechanisms for handling this (JCo connection pools), but upon reading about SAP JRA, I thought that it would be a more elegant and proper solution, given that it complies with JCA. Can JRA be used on JBoss? If anyone can point me to useful information on integrating this with JBoss, and some code examples, it would be much appreciated. Thanks!
Hi Jose,
You can integrate the JBoss portal on one side and SAP EP on the other, and make the connection two-way using a federated portal network. A JBoss portal user can make a request from the JBoss portal and get the data from the SAP portal.
http://www.jboss.org/jbossportal/
https://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/97171dd3-0401-0010-5195-b43f556e6ce9?overridelayout=true
Let me know if you need any further information.
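Back to Jose's original pooling question: JRA is delivered as a JCA resource adapter for SAP NetWeaver AS Java, so on a third-party server like JBoss the usual route is plain JCo with its built-in destination pooling. Conceptually, what such a pool does can be sketched in plain Java; this is a generic illustration with invented names, not JCo or JRA API:

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;
import java.util.function.Supplier;

public class SimplePool<T> {
    private final BlockingQueue<T> idle;

    // Pre-create 'size' connections up front, as a pooled JCo destination would.
    public SimplePool(int size, Supplier<T> factory) {
        idle = new ArrayBlockingQueue<>(size);
        for (int i = 0; i < size; i++) idle.add(factory.get());
    }

    // Borrow a connection; blocks when the pool is exhausted.
    public T acquire() throws InterruptedException { return idle.take(); }

    // Return the connection for reuse instead of closing it.
    public void release(T conn) { idle.offer(conn); }

    public static void main(String[] args) throws InterruptedException {
        SimplePool<String> pool = new SimplePool<>(2, () -> "connection");
        String c = pool.acquire();   // expensive logon happens only at pool creation
        pool.release(c);
        System.out.println("borrowed and returned a pooled connection");
    }
}
```

The point of pooling is that the expensive RFC logon happens once per pooled connection, not once per call; JCo manages this for you when a destination is configured with pool capacity properties.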
bvr -
Downloding the SAP master data and transaction data to a flat file
Hello All,
Is there any SAP standard method or transaction to download SAP master data and transaction data to a flat file?
Without using ABAP development, has SAP provided any tool or method to download the system's master and transaction data to a flat file?
Thanks,
Feroz.
Hi,
As of now, to my knowledge, no. -
How to save info in a meta-data of a jpg file?
Hi, I need to know how to save info in the metadata of a JPG file.
This is my code (it doesn't work);
I get an exception:
javax.imageio.metadata.IIOInvalidTreeException: JPEGvariety and markerSequence nodes must be present
at com.sun.imageio.plugins.jpeg.JPEGMetadata.mergeNativeTree(JPEGMetadata.java:1088)
at com.sun.imageio.plugins.jpeg.JPEGMetadata.mergeTree(JPEGMetadata.java:1061)
at playaround.IIOMetaDataWriter.run(IIOMetaDataWriter.java:59)
at playaround.Main.main(Main.java:14)
package playaround;
import java.io.*;
import java.util.Iterator;
import java.util.Locale;
import javax.imageio.*;
import javax.imageio.metadata.IIOMetadata;
import javax.imageio.metadata.IIOMetadataNode;
import javax.imageio.plugins.jpeg.JPEGImageWriteParam;
import javax.imageio.stream.*;
import org.w3c.dom.*;
public class IIOMetaDataWriter {
    public static void run(String[] args) throws IOException {
        try {
            File f = new File("C:/images.jpg");
            ImageInputStream ios = ImageIO.createImageInputStream(f);
            Iterator readers = ImageIO.getImageReaders(ios);
            ImageReader reader = (ImageReader) readers.next();
            reader.setInput(ImageIO.createImageInputStream(f));
            ImageWriter writer = ImageIO.getImageWriter(reader);
            writer.setOutput(ImageIO.createImageOutputStream(f));
            JPEGImageWriteParam param = new JPEGImageWriteParam(Locale.getDefault());
            IIOMetadata metaData = writer.getDefaultStreamMetadata(param);
            String metadataFormatName = metaData.getNativeMetadataFormatName();
            IIOMetadataNode root = (IIOMetadataNode) metaData.getAsTree(metadataFormatName);
            // The native JPEG tree requires a JPEGvariety node followed by a
            // markerSequence node. The original code created the markerSequence
            // node with the name "JPEGvariety", which is what triggers the
            // IIOInvalidTreeException quoted above.
            IIOMetadataNode jv = getChildNode(root, "JPEGvariety");
            if (jv == null) {
                jv = new IIOMetadataNode("JPEGvariety");
                root.appendChild(jv);
            }
            IIOMetadataNode markerSequence = getChildNode(root, "markerSequence");
            if (markerSequence == null) {
                markerSequence = new IIOMetadataNode("markerSequence");
                root.appendChild(markerSequence);
            }
            IIOMetadataNode child = getChildNode(jv, "myNode");
            if (child == null) {
                child = new IIOMetadataNode("myNode");
                jv.appendChild(child);
            }
            child.setAttribute("myAttName", "myAttValue");
            metaData.mergeTree(metadataFormatName, root);
        } catch (Throwable t) {
            t.printStackTrace();
        }
    }
    protected static IIOMetadataNode getChildNode(Node n, String name) {
        NodeList nodes = n.getChildNodes();
        for (int i = 0; i < nodes.getLength(); i++) {
            Node child = nodes.item(i);
            if (name.equals(child.getNodeName())) {
                return (IIOMetadataNode) child;
            }
        }
        return null;
    }
    static void displayMetadata(Node node, int level) {
        indent(level); // emit open tag
        System.out.print("<" + node.getNodeName());
        NamedNodeMap map = node.getAttributes();
        if (map != null) { // print attribute values
            int length = map.getLength();
            for (int i = 0; i < length; i++) {
                Node attr = map.item(i);
                System.out.print(" " + attr.getNodeName() + "=\"" + attr.getNodeValue() + "\"");
            }
        }
        Node child = node.getFirstChild();
        if (child != null) {
            System.out.println(">"); // close current tag
            while (child != null) { // emit child tags recursively
                displayMetadata(child, level + 1);
                child = child.getNextSibling();
            }
            indent(level); // emit close tag
            System.out.println("</" + node.getNodeName() + ">");
        } else {
            System.out.println("/>");
        }
    }
    static void indent(int level) {
        for (int i = 0; i < level; i++) {
            System.out.print(" ");
        }
    }
}
Hi,
Yes, you need to store the data in a table and fetch it when the page is opened.
A simple way is to create a table with a few columns, e.g. with a CLOB column, and then create a form based on that table.
Then modify the item types as you like, e.g. use an HTML editor for the CLOB column.
Jari -
SAP BW data in ODS to XML and sending this XML file to 3rd Party
Hi Gurus,
We have a scenario in which we have to convert our data in an ODS to XML and provide this XML file to our clients so that they can use it in their 3rd-party system.
I have created an ABAP program for converting the data into XML.
If I execute this program, since I have given the path as my Desktop, the converted XML file gets saved on my Desktop.
But the problem is how I can provide this XML file to the client.
Is there any way of converting this XML file to HTML and sending them the URL?
Please suggest what can be done.
My ABAP code you can see in the following link:
Extract SAP BW Data into XML
thanks and regards,
P.Madhusudhan Raju
Hi,
Please go through the link below; it may help you.
http://www.sdn.sap.com/irj/scn/index;jsessionid=(J2EE3417100)ID1537206350DB01670597797729934632End?rid=/library/uuid/8c10aa90-0201-0010-98a0-f071394bc9ae&overridelayout=true
Regards,
Marasa. -
I have a VI that writes to a network shared variable using DataSocket. The DataSocket URL uses PSP. I have another VI that reads the network shared variable also using DataSocket. I am experimenting with data buffering to see when data is lost if the Writer VI writes faster than the Reader VI. Is data buffered using DataSocket with PSP in the URL? If not, I expect data will be lost. If it is buffered, I don't expect data to be lost until the buffer is full or overflows.
Attached is a project with the network shared variables and the Writer and Reader VIs. The VIs compare reading and writing directly using a shared variable node versus using DataSocket. With DataSocket, I am experiencing data loss as if there is no buffering. When using the shared variable node, I do not see data loss. Run Reader.vi. It will read two network shared variables every two seconds. One variable is read using DataSocket and one is read using a variable node. Next, run Writer.vi. It will write to two network shared variables every 0.5 seconds. One variable is written using DataSocket and one is written using a variable node. Since the Writer VI is writing four times as fast as the Reader VI, data will need to be buffered to avoid data loss. Monitor the Buffered Sequence and BufferedDS Sequence front panel indicators in the Reader VI. Buffered Sequence is data from the variable node. BufferedDS Sequence is data from the DataSocket Read.
Solved!
Go to Solution.
Attachments:
Net Share Var & DataSocket.zip 49 KB
Does PSP in the DataSocket URL change the data buffering? Attached is a page from 'LabVIEW 8.5.1 Help / Fundamentals / Networking in LabVIEW / Concepts / Choosing Among LabVIEW Communication Features' mentioning lossless data transmission for DataSocket with the psp protocol (2nd row in the table). Does lossless data indicate that one packet is guaranteed to be sent from the writer and received by the reader, or does it provide the guarantee with additional packets buffered?
Attachments:
LabVIEW Communication Features.pdf 61 KB