Long runtime for loading PCA data into PCA ODS
hi all,
hope you can help me regarding this problem:
I have been trying to load the PCA ODS (standard) from R/3, but it is taking too long, so I tried to limit the size by using key fields, with no luck. I also tried loading to PSA, and even that takes a long time for around 53,440 records. I am unable to figure out the problem; I have checked all the transfer rules and update rules and everything is fine.
can anyone please help me
thanx a lot in advance
Hi there,
don't know if you're loading line items, but the following recommendation from SAP shortened our delta loads from 1.5-2 hours down to 2 minutes:
To optimize the performance of data procurement when DataSource 0EC_PCA_3 accesses line item table GLPCA in the OLTP, you should set up an additional secondary index. This should contain the following fields:
· RCLNT
· CPUDT
· CPUTM
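As a sketch, the recommended index expressed as plain DDL (the name Z01 is made up; on an SAP system you would normally create the index through transaction SE11 so it is recorded in the ABAP dictionary):

```sql
-- Secondary index on the PCA actual line-item table, covering client
-- plus entry date/time, which is what the delta extraction selects on.
-- Index name "Z01" is an arbitrary customer-namespace example.
CREATE INDEX "GLPCA~Z01" ON GLPCA (RCLNT, CPUDT, CPUTM);
```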
Kind regards
/martin
Similar Messages
-
Hi,
I have to extract a big volume of data (millions of records) into an ODS. But the problem is, I cannot give selections in the InfoPackage as there are no fields in the data selection tab.
If I extract all the data, then the ODS activation will definitely fail. Is there any way of doing this?
Many thanks.
I think that with Siggi's post you have another version of the same suggestion, and maybe you understand it better now.
Anyway, if you want to split your load into several steps (in order to avoid a single request with a lot of records), go to RSA6, search for your DataSource, check the 'selection' column for the field you need, activate, replicate the source system, and then you will find the field available for selection in the InfoPackage!
But I also suggested checking why you have an activation error, since it's not a normal situation: you can easily load one million records into the active table of your ODS. Investigate the reason and try to remove the cause!
Hope it is clearer now.
Bye,
Roberto -
Using VBA for loading Query data into Excel workbook
Hi all.
I simply want to load data from a BEx query into an Excel worksheet using VBA, because of the report formats and a footer section at the end of the results.
Any code examples, tutorials, comments, or suggestions would be appreciated.
thanx in advance,
Gediminas
The difficulty is that I don't know how many rows the report will return. And I need my footer only on the LAST page of the workbook.
Another thing I can't imagine how to do using standard BEx functionality is designing a complex column header set (merged columns, sub-columns, etc.).
Option to load XML data into BI 70
Hi All,
I have read some postings on the forums and also the "HOW TO load XML data in BW" guide, but I am confused by the SAP online help documentation for loading XML data.
Question: Out of the 3 options listed below, which is best suited for loading XML data into BW? The XML file is going to be provided by a 3rd-party company.
Thanks and appreciate any input .
Smith .
Here is what I found for BI 70 EHP1 help :
http://help.sap.com/saphelp_nw04s/helpdata/en/b2/e50138fede083de10000009b38f8cf/frameset.htm
Data is generally transferred into SAP BW by means of a data request, which is sent from SAP BW to the source system (pull from the scheduler). You can also send the data to SAP BW from outside the system. This is a data push into SAP BW.
A data push is possible for various scenarios:
● Transferring Data Using the SOAP Service SAP Web AS
● Transferring Data Using Web Services
● Transferring Data Using SAP XI
In all three scenarios, data transfer takes place using transfer mechanisms that are sufficient for Simple Object Access Protocol (SOAP); the data transfer is also XML-based.
The SOAP-based transfer of data is only possible for flat structures. You cannot transfer hierarchy data.
Hi,
I feel you can go with the 2nd option. But still, wait for some more input from the experts.
Regards,
Suman -
ASCP Test for Load Legacy Data
We are testing loading legacy ASCP data through Web ADI.
When we check more than one plan, the results are not correct because there are no rows in msc_operation_components.
Question 1: Should the msc_operation_components table also be populated through the Web ADI upload of BOM_Component and Routing_operation data, or is it produced by the Pre-Process Monitor program or some other concurrent program? Please tell us how it is produced.
Question 2: If we create the data collection flat files through self-service, do we need to upload msc_operation_components data through the OA template upload? If the data is not there, what should we do?
I think you'll find the white paper published on Metalink, our main avenue for publishing documentation and providing support information to customers.
Originally posted by JohnWorthington:
Hello,
The 11.0.2 implementation doc mentions a technical "White Paper about the Data Pump" for loading legacy data into HRMS.
I cannot find this White Paper anywhere.
Does it exist?
What is the easiest way to load our legacy data into 11.0.3 HRMS?
Thanks,
John Worthington
[email protected] -
Error in Loading FIGL data into a standard ODS
Hi,
I have encountered an error when attempting to load FIGL data into a standard ODS. Below is the error I received:
Error message during processing in BI
Diagnosis
An error occurred in BI while processing the data. The error is documented in an error message.
System Response
A caller 01, 02 or equal to or greater than 20 contains an error message.
Further analysis:
The error message(s) was (were) sent by:
Second step in the update
Second step in the update
Second step in the update
Second step in the update
Procedure
Check the error message (pushbutton below the text).
Select the message in the message dialog box, and look at the long text for further information.
Follow the instructions in the message.
Initially I discovered a workaround: I loaded the data into a write-optimized ODS, and it was successful. However, when I attempted to data-mart the data into a standard ODS, the same error occurred.
Please help me with this problem.
Ramon
Hi,
I suggest you check a few places where you can see the status:
1) SM37 job log (in the source system if the load is from R/3, or in BW if it's a datamart load): give the request name and it should show you the details about the request. If it's active, make sure the job log is getting updated at frequent intervals.
2) SM66: get the job details (server name, PID, etc. from SM37) and check in SM66 whether the job is running or not (in the source system if the load is from R/3, or in BW if it's a datamart load). See if it's accessing/updating some tables or not doing anything at all.
3) RSMO: see what is available in the details tab. It may be in the update rules.
4) ST22: check if any short dump has occurred (in the source system if the load is from R/3, or in BW if it's a datamart load).
Once you identify the error you can rectify it.
If all the records are in the PSA, you can pull them from the PSA to the target. Otherwise you may have to pull them again from the source InfoProvider.
If it's running and you can see it active in SM66, you can wait for some time to let it finish. You can also try SM50 / SM51 to see what is happening at the system level, like reading/inserting into tables.
If you feel it's active and running, you can verify by checking whether the number of records has increased in the new data table.
Thanks,
JituK -
DTP running long time to load master data due to lock on SID table
Hi,
we are facing a problem with a DTP loading master data into targets (InfoObjects).
It is taking a long time to load the data, as the timestamps in the log below show.
18.03.2014 01:03:23 All records forwarded RSM2 730 S
18.03.2014 01:13:39 EXTRACTION OF DATAPACKAGE 000009 RSAR 051 S
18.03.2014 01:13:40 All records forwarded RSM2 730 S
18.03.2014 01:24:50 EXTRACTION OF DATAPACKAGE 000010 RSAR 051 S
18.03.2014 01:24:51 All records forwarded RSM2 730 S
18.03.2014 01:38:34 EXTRACTION OF DATAPACKAGE 000011 RSAR 051 S
18.03.2014 01:38:41 All records forwarded RSM2 730 S
18.03.2014 01:52:10 EXTRACTION OF DATAPACKAGE 000012 RSAR 051 S
18.03.2014 01:52:10 All records forwarded RSM2 730 S
18.03.2014 02:06:51 EXTRACTION OF DATAPACKAGE 000013 RSAR 051 S
18.03.2014 02:06:53 All records forwarded RSM2 730 S
18.03.2014 02:22:32 EXTRACTION OF DATAPACKAGE 000014 RSAR 051 S
18.03.2014 02:22:33 All records forwarded RSM2 730 S
18.03.2014 02:38:36 EXTRACTION OF DATAPACKAGE 000015 RSAR 051 S
18.03.2014 02:38:37 All records forwarded RSM2 730 S
18.03.2014 02:55:25 EXTRACTION OF DATAPACKAGE 000016 RSAR 051 S
18.03.2014 02:55:25 All records forwarded RSM2 730 S
18.03.2014 03:13:56 SQL: 18.03.2014 03:13:56 ALEREMOTE DBMAN 099 I
18.03.2014 03:13:56 TRUNCATE TABLE "/BI0/0600000066" DBMAN 099 I
18.03.2014 03:13:56 SQL-END: 18.03.2014 03:13:56 00:00:00 DBMAN 099 I
18.03.2014 03:13:56 SQL: 18.03.2014 03:13:56 ALEREMOTE DBMAN 099 I
18.03.2014 03:13:56 LOCK TABLE "/BI0/0600000066" IN EXCLUSIVE MODE DBMAN 099 I
18.03.2014 03:13:56 NOWAI DBMAN 099 I
18.03.2014 03:13:56 SQL-END: 18.03.2014 03:13:56 00:00:00 DBMAN 099 I
18.03.2014 03:13:58 EXTRACTION OF DATAPACKAGE 000017 RSAR 051 S
18.03.2014 03:13:59 All records forwarded RSM2 730 S
18.03.2014 03:34:24 EXTRACTION OF DATAPACKAGE 000018 RSAR 051 S
18.03.2014 03:34:25 All records forwarded RSM2 730 S
18.03.2014 03:56:03 EXTRACTION OF DATAPACKAGE 000019 RSAR 051 S
18.03.2014 03:56:03 All records forwarded RSM2 730 S
18.03.2014 04:19:55 EXTRACTION OF DATAPACKAGE 000020 RSAR 051 S
18.03.2014 04:19:56 All records forwarded RSM2 730 S
18.03.2014 04:44:07 SQL: 18.03.2014 04:44:07 ALEREMOTE DBMAN 099 I
18.03.2014 04:44:07 TRUNCATE TABLE "/BI0/0600000068" DBMAN 099 I
18.03.2014 04:44:09 SQL-END: 18.03.2014 04:44:09 00:00:02 DBMAN 099 I
18.03.2014 04:44:09 SQL: 18.03.2014 04:44:09 ALEREMOTE DBMAN 099 I
18.03.2014 04:44:09 LOCK TABLE "/BI0/0600000068" IN EXCLUSIVE MODE DBMAN 099 I
18.03.2014 04:44:09 NOWAI DBMAN 099 I
18.03.2014 04:44:09 SQL-END: 18.03.2014 04:44:09 00:00:00 DBMAN 099 I
18.03.2014 04:44:12 EXTRACTION OF DATAPACKAGE 000021 RSAR 051 S
18.03.2014 04:44:13 All records forwarded RSM2 730 S
For this long-running data load we didn't find any short dumps, and no locks in SM12 either.
Please help on this.
Thanks,
Sankar
Can we know what kind of DataSource this is: attribute or text?
What is the data volume: small or large?
Actually, we load direct-access data through a DTP (no PSA) into InfoObjects only for text DataSources that have little data.
While your DTP is running, is an attribute change run happening for the same InfoObject, or are any transactional loads happening?
Does this happen on a particular day, on weekends, or every day?
Thanks -
Error While loading the data into PSA
Hi Experts,
I have already loaded the data into my cube, but it doesn't have values for some fields. So I modified the data in the flat file again and tried to load it into the PSA. But while starting the InfoPackage, I got an error saying:
"Check Load from InfoSource
Created YOKYY on 20.02.2008 12:44:33
Check Load from InfoSource , Packet IP_DS_C11
Please execute the mail for additional information.
Error message from the source system
Diagnosis
An error occurred in the source system.
System Response
Caller 09 contains an error message.
Further analysis:
The error occurred in Extractor .
Refer to the error message.
Procedure
How you remove the error depends on the error message.
Note
If the source system is a Client Workstation, then it is possible that the file that you wanted to load was being edited at the time of the data request. Make sure that the file is in the specified directory, that it is not being processed at the moment, and restart the request.
Please help me with this.
With Regards,
Yokesh
Hi,
After editing the file, did you save and close it?
This error can occur if the file was open at the time of the request.
Also, did you check the file path settings?
If everything is correct, try saving the InfoPackage once and loading again.
Thanks,
JituK -
What are the steps for loading master data
Hello
What are the steps for loading master data? I want to learn about loading all master data and how to choose the best way to load it.
If anyone has documents, please send them to me; I will be really grateful.
[email protected]
Thanks everyone,
Evion
Hi Heng,
Download the data into a CSV file.
Write a program using GUI_UPLOAD to upload the CSV file and insert the records. Check the link below for an example:
http://www.sap-img.com/abap/vendor-master-upload-program.htm
Reward Points for the useful solutions.
Regards,
Harini.S -
Caller-70 Error while loading master data into infoobject
Hi,
I am getting the following error while loading master data into an InfoObject (0tb-account). I am loading this master data in the production environment for the first time. There are about 300,000 records. All of them loaded up to the PSA. The InfoPackage setting was PSA and then into data targets.
Short dump in the Warehouse
Diagnosis
The data update was not finished. A short dump has probably been logged in BI. This provides information about the error.
System Response
"Caller 70" is missing.
ST22 dump analysis is as below:
Termination occurred in the ABAP program "GP476CZYBEF2WX53UZ8TXFG6XOS" - in
"VALUE_TO_SID_CONVERT_DB".
The main program was "RSMO1_RSM2 ".
Please help as soon as you can; this is a production problem.
Regards
Rakesh
Hi Rakesh,
Maybe the IDocs were not processed completely.
It's an IDoc problem: either wait until the timeout and process the IDoc from the detail monitor screen, or go to BD87 and process the IDocs with status = YELLOW (be careful while processing IDocs from BD87; choose only the relevant IDocs).
Cheers
Raj -
Best method to load XML data into Oracle
Hi,
I have to load XML data into Oracle tables. I have tried different options and run into a dead end with each of them. I have no knowledge of Java and hence have restricted myself to PL/SQL solutions. I tried the following options:
1. Using the DBMS_XMLSave package: it expects the ROWSET and ROW tags, and I cannot change the format of the incoming XML file (gives the error oracle.xml.sql.OracleXMLSQLException: Start of root element expected).
2. Using the XMLPARSER and XMLDOM PL/SQL APIs: works fine for small files, but runs into memory problems for large files (gives the error java.lang.OutOfMemoryError). I have tried increasing JAVA_POOL_SIZE but it does not help; I am not sure whether I am changing the correct parameter.
I have read that the SAX API does not hog memory, since it does not build the entire DOM tree structure. But the problem is that it has no PL/SQL implementation.
Can anyone please guide me in the right direction as to the best way to achieve this through PL/SQL? I have not designed the tables, so I am flexible about using a purely relational or object-relational design, although I would prefer to keep it purely relational. (I had tried object-relational for 1. and purely relational for 2. above.)
The XML files are in the following format, (EXAMINEEs with single DEMOGRAPHIC and multiple TESTs)
<?xml version="1.0"?>
<Root_Element>
<Examinee>
<MACode>A</MACode>
<TestingJID>TN</TestingJID>
<ExamineeID>100001</ExamineeID>
<CreateDate>20020221</CreateDate>
<Demographic>
<InfoDate>20020221</InfoDate>
<FirstTime>1</FirstTime>
<LastName>JANE</LastName>
<FirstName>DOE</FirstName>
<MiddleInitial>C</MiddleInitial>
<LithoNumber>73</LithoNumber>
<StreetAddress>SomeAddress</StreetAddress>
<City>SomeCity</City>
<StateCode>TN</StateCode>
<ZipCode>37000</ZipCode>
<PassStatus>1</PassStatus>
</Demographic>
<Test>
<TestDate>20020221</TestDate>
<TestNbr>1</TestNbr>
<SrlNbr>13773784</SrlNbr>
</Test>
<Test>
<TestDate>20020221</TestDate>
<TestNbr>2</TestNbr>
<SrlNbr>13773784</SrlNbr>
</Test>
</Examinee>
</Root_Element>
Thanks for the help.
Please refer to the XSU (XML SQL Utility), or the TransX utility for multi-language documents, if you want to load data in XML format into the database.
Both of them require special XML formats; please first refer to the following docs:
http://otn.oracle.com/docs/tech/xml/xdk_java/doc_library/Production9i/doc/java/xsu/xsu_userguide.html
http://otn.oracle.com/docs/tech/xml/xdk_java/doc_library/Production9i/doc/java/transx/readme.html
You can use XSLT to transform your document to the required format.
If your document is large, you can use the SAX method to insert data into the database, but you need to write the code yourself.
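The thread asks for PL/SQL, but the streaming idea itself is easy to show language-agnostically. Below is a minimal Python sketch (illustration only, trimmed to a few elements of the sample file above): a SAX handler flattens each Test element into a row carrying its parent ExamineeID, without ever building a DOM tree, which is why memory use stays flat for large files. Where the sketch appends to a list, a real loader would issue an INSERT.

```python
import xml.sax

# Streaming (SAX) parse: no DOM tree is built, so memory use does not
# grow with file size. Element names match the sample document above.
class ExamineeHandler(xml.sax.ContentHandler):
    def __init__(self):
        super().__init__()
        self.rows = []        # one dict per <Test>, joined with parent key
        self.current = {}
        self.text = ""
        self.examinee_id = None

    def startElement(self, name, attrs):
        self.text = ""        # reset character buffer for each element
        if name == "Test":
            self.current = {}

    def characters(self, content):
        self.text += content  # may arrive in several chunks

    def endElement(self, name):
        if name == "ExamineeID":
            self.examinee_id = self.text.strip()
        elif name in ("TestDate", "TestNbr", "SrlNbr"):
            self.current[name] = self.text.strip()
        elif name == "Test":
            self.current["ExamineeID"] = self.examinee_id
            self.rows.append(self.current)  # a real loader would INSERT here

xml_doc = """<?xml version="1.0"?>
<Root_Element>
  <Examinee>
    <ExamineeID>100001</ExamineeID>
    <Test><TestDate>20020221</TestDate><TestNbr>1</TestNbr><SrlNbr>13773784</SrlNbr></Test>
    <Test><TestDate>20020221</TestDate><TestNbr>2</TestNbr><SrlNbr>13773784</SrlNbr></Test>
  </Examinee>
</Root_Element>"""

handler = ExamineeHandler()
xml.sax.parseString(xml_doc.encode("utf-8"), handler)
print(handler.rows)
```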
The following sample may be useful:
http://otn.oracle.com/tech/xml/xdk_sample/xdksample_040602i.html -
Step- by- Step on How to Load Excel data into Crystal Reports?
Hi Friends,
Can anyone send me a step-by-step guide on how to load Excel data into Crystal Reports? Please help me. Thanks in advance.
Vijay
It's also important to 'prep' the Excel file prior to connecting to it:
Give the data tab a meaningful name
Make sure the column headers are unique and that every column has a header
Delete any blank tabs
If you have trouble with Excel changing the data type of a field (say, a social security number that you want to be a string value rather than a number, so you don't lose the leading zero), an alternative is to save the spreadsheet as a CSV, create a schema.ini to specify the data type for each column (example below), and use the same connection steps, except instead of choosing Excel 8.0, scroll to the bottom and choose Text. Make sure the CSV file is in the same folder as the schema.ini file that defines the columns.
Schema.ini example:
[200912PUSD.csv]
ColNameHeader=True
Format=CSVDelimited
MaxScanRows=25
CharacterSet=OEM
Col1=SSN Char Width 9
Col2=LAST_NM Char Width 25
Col3=FIRST_NM Char Width 25
Col4=DOB Date
Col5=STDNT_ID Char Width 10
Col6=SORTKEY Char Width 10
Col7=SCHOOL_NM Char Width 30
Col8=OTHER_ID Integer
Col9=GRADE Char Width 2
Note the [] brackets around the filename in the first line; they are required.
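As a quick sanity check of why the string typing matters, here is a small Python sketch (purely for illustration; the column names mirror the schema.ini above and the data is invented). A plain-text CSV reader returns every field as a string, so the leading zero survives, which is exactly what the Char typing in schema.ini buys you on the Crystal side:

```python
import csv
import io

# csv.DictReader returns every field as a string, so an SSN like
# "012345678" keeps its leading zero; Excel would coerce it to a number.
sample = io.StringIO("SSN,LAST_NM,FIRST_NM\n012345678,DOE,JANE\n")
rows = list(csv.DictReader(sample))
print(rows[0]["SSN"])
```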
Edited by: Eric Petersen on Jan 27, 2010 9:40 AM -
XML files imported into SPRC are not properly loading associated data into SPRC
Hi EHS Gurus,
My issue is that XML files imported into SPRC are not properly loading their associated data into SPRC.
When I upload the XML files manually, it works fine in /TDAG/CPM00.
But when we upload the XML files through the basis program 'IPC (XML) Documents into SAP', it does not work: the XML file shows it is compliant, but at the same time the compliance workbench shows the compliance status as empty.
For your reference, please see the attached screenshot.
Can you please help me with this?
Regards,
Suresh. -
Does SSIS guarantee that it loads the data into SQL Server in the same order as it is in Excel
Hi,
We are trying to load several Excel files into SQL Server with SSIS, and we need to save the data in the database in the same order as it is in Excel.
My question is: does SSIS guarantee that it loads the data into SQL Server in the same order as it is in Excel? If so, we will add a sequence to ensure we preserve the order.
Please advise.
Thanks & Regards,
Dhanumjay
Thanks for your response.
If it were one file we could add an index column, but we have to load hundreds of files.
How would adding an index/key column to the table help, unless SSIS picks up and loads the data into the table in the same order as it is in Excel?
Just to explain my question better:
if Excel has three rows {a,b}, {c,d}, {e,f}, does SSIS ensure it loads the data into the table in the same order?
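The usual fix, independent of SSIS: stamp each row with its position in the source file while reading, then ORDER BY that column in the database. A small Python sketch of the idea, using a CSV stand-in for the Excel source (file contents and column names invented):

```python
import csv
import io

# Tag each source row with its 1-based position so the original order can
# be reconstructed later with an ORDER BY, no matter how rows arrive.
src = io.StringIO("a,b\nc,d\ne,f\n")
rows = [
    {"row_order": i, "col1": r[0], "col2": r[1]}
    for i, r in enumerate(csv.reader(src), start=1)
]
print(rows)
```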
Thanks. -
Hi Experts,
One of my data targets (data coming from another ODS) is taking a long time to load. It usually takes under 10 minutes, but today it has been running for the last 40 minutes.
In Status Tab it showing....
Job termination in source system
Diagnosis
The background job for data selection in the source system has been terminated. It is very likely that a short dump has been logged in the source system
Procedure
Read the job log in the source system. Additional information is displayed here.
To access the job log, use the monitor wizard (step-by-step analysis) or the menu path Environment -> Job Overview -> In Source System.
Error correction:
Follow the instructions in the job log messages.
Can anyone please solve my problem.
Thanks in advance
David
Hi Experts,
Thanks for your answers. My load failed when the data came from one ODS to another ODS. Please find the job log below:
Job started
Step 001 started (program SBIE0001, variant &0000000007169, user ID RCREMOTE)
Asynchronous transmission of info IDoc 2 in task 0001 (0 parallel tasks)
DATASOURCE = 8ZPP_OP3
Current Values for Selected Profile Parameters *
abap/heap_area_nondia......... 20006838008 *
abap/heap_area_total.......... 20006838008 *
abap/heaplimit................ 83886080 *
zcsa/installed_languages...... ED *
zcsa/system_language.......... E *
ztta/max_memreq_MB............ 2047 *
ztta/roll_area................ 5000000 *
ztta/roll_extension........... 4294967295 *
Call customer enhancement EXIT_SAPLRSAP_001 (CMOD) with 100,000 records
Result of customer enhancement: 100,000 records
Asynchronous send of data package 1 in task 0002 (1 parallel tasks)
tRFC: Data Package = 0, TID = , Duration = 00:00:00, ARFCSTATE =
tRFC: Start = 06.09.2010 01:31:55, End = 06.09.2010 01:31:55
Asynchronous transmission of info IDoc 3 in task 0003 (1 parallel tasks)
Call customer enhancement EXIT_SAPLRSAP_001 (CMOD) with 100,000 records
Result of customer enhancement: 100,000 records
Asynchronous send of data package 2 in task 0004 (2 parallel tasks)
tRFC: Data Package = 0, TID = , Duration = 00:00:00, ARFCSTATE =
tRFC: Start = 06.09.2010 01:32:00, End = 06.09.2010 01:32:00
Asynchronous transmission of info IDoc 4 in task 0005 (2 parallel tasks)
Call customer enhancement EXIT_SAPLRSAP_001 (CMOD) with 100,000 records
Result of customer enhancement: 100,000 records
Asynchronous send of data package 3 in task 0006 (3 parallel tasks)
tRFC: Data Package = 0, TID = , Duration = 00:00:00, ARFCSTATE =
tRFC: Start = 06.09.2010 01:32:04, End = 06.09.2010 01:32:04
Asynchronous transmission of info IDoc 5 in task 0007 (3 parallel tasks)
Call customer enhancement EXIT_SAPLRSAP_001 (CMOD) with 100,000 records
Result of customer enhancement: 100,000 records
Asynchronous send of data package 4 in task 0008 (4 parallel tasks)
tRFC: Data Package = 0, TID = , Duration = 00:00:00, ARFCSTATE =
tRFC: Start = 06.09.2010 01:32:08, End = 06.09.2010 01:32:08
Asynchronous transmission of info IDoc 6 in task 0009 (4 parallel tasks)
Call customer enhancement EXIT_SAPLRSAP_001 (CMOD) with 100,000 records
Result of customer enhancement: 100,000 records
Asynchronous send of data package 5 in task 0010 (5 parallel tasks)
tRFC: Data Package = 0, TID = , Duration = 00:00:00, ARFCSTATE =
tRFC: Start = 06.09.2010 01:32:11, End = 06.09.2010 01:32:11
Asynchronous transmission of info IDoc 7 in task 0011 (5 parallel tasks)
Call customer enhancement EXIT_SAPLRSAP_001 (CMOD) with 100,000 records
Asynchronous send of data package 13 in task 0026 (6 parallel tasks)
tRFC: Data Package = 0, TID = , Duration = 00:00:01, ARFCSTATE =
tRFC: Start = 06.09.2010 01:32:44, End = 06.09.2010 01:32:45
tRFC: Data Package = 8, TID = 0AEB465C00AE4C847CEA0070, Duration = 00:00:17,
tRFC: Start = 06.09.2010 01:32:29, End = 06.09.2010 01:32:46
Asynchronous transmission of info IDoc 15 in task 0027 (5 parallel tasks)
Call customer enhancement EXIT_SAPLRSAP_001 (CMOD) with 100,000 records
Result of customer enhancement: 100,000 records
Asynchronous send of data package 14 in task 0028 (6 parallel tasks)
tRFC: Data Package = 0, TID = , Duration = 00:00:00, ARFCSTATE =
tRFC: Start = 06.09.2010 01:32:48, End = 06.09.2010 01:32:48
tRFC: Data Package = 9, TID = 0AEB465C00AE4C847CEF0071, Duration = 00:00:18,
tRFC: Start = 06.09.2010 01:32:33, End = 06.09.2010 01:32:51
Asynchronous transmission of info IDoc 16 in task 0029 (5 parallel tasks)
Call customer enhancement EXIT_SAPLRSAP_001 (CMOD) with 100,000 records
Result of customer enhancement: 100,000 records
Asynchronous send of data package 15 in task 0030 (6 parallel tasks)
tRFC: Data Package = 0, TID = , Duration = 00:00:00, ARFCSTATE =
tRFC: Start = 06.09.2010 01:32:52, End = 06.09.2010 01:32:52
Asynchronous transmission of info IDoc 17 in task 0031 (6 parallel tasks)
Call customer enhancement EXIT_SAPLRSAP_001 (CMOD) with 100,000 records
Result of customer enhancement: 100,000 records
Asynchronous send of data package 16 in task 0032 (7 parallel tasks)
tRFC: Data Package = 10, TID = 0AEB465C00684C847CF30070, Duration = 00:00:18,
tRFC: Start = 06.09.2010 01:32:37, End = 06.09.2010 01:32:55
tRFC: Data Package = 11, TID = 0AEB465C02E14C847CF70083, Duration = 00:00:17,
tRFC: Start = 06.09.2010 01:32:42, End = 06.09.2010 01:32:59
tRFC: Data Package = 0, TID = , Duration = 00:00:00, ARFCSTATE =
tRFC: Start = 06.09.2010 01:32:56, End = 06.09.2010 01:32:56
Asynchronous transmission of info IDoc 18 in task 0033 (5 parallel tasks)
Call customer enhancement EXIT_SAPLRSAP_001 (CMOD) with 100,000 records
Result of customer enhancement: 100,000 records
Asynchronous send of data package 17 in task 0034 (6 parallel tasks)
tRFC: Data Package = 0, TID = , Duration = 00:00:00, ARFCSTATE =
tRFC: Start = 06.09.2010 01:33:00, End = 06.09.2010 01:33:00
tRFC: Data Package = 12, TID = 0AEB465C00AE4C847CFB0072, Duration = 00:00:16,
tRFC: Start = 06.09.2010 01:32:46, End = 06.09.2010 01:33:02
Asynchronous transmission of info IDoc 19 in task 0035 (5 parallel tasks)
Call customer enhancement EXIT_SAPLRSAP_001 (CMOD) with 100,000 records
Result of customer enhancement: 100,000 records
Asynchronous send of data package 18 in task 0036 (6 parallel tasks)
tRFC: Data Package = 0, TID = , Duration = 00:00:00, ARFCSTATE =
tRFC: Start = 06.09.2010 01:33:04, End = 06.09.2010 01:33:04
Asynchronous transmission of info IDoc 20 in task 0037 (6 parallel tasks)
ABAP/4 processor: DBIF_RSQL_SQL_ERROR
Job cancelled
Thanks
David
Edited by: david Rathod on Sep 6, 2010 12:04 PM