Huge volume of data not getting processed
Hello Everyone,
It's a single-file to multiple-IDoc scenario. There is no mapping involved. But the problem is that the file is 50 MB in size and contains 50,000 IDocs. These IDocs get divided and are sent to BW for reporting based on the message ID. Since the message ID should remain the same for all 50,000 IDocs, I cannot split the file.
But when I process this, it gives me a Lock_Table_Overflow error. The function module IDOC_INBOUND_ASYNCHRONOUS is in error in SM58. I have checked enque/table_size and it is 64,000; I think that is enough to process a 50 MB file.
Please let me know how to proceed further with this.
Regards,
Ravi
Hi Ravi,
I don't really think this is a problem with PI itself, especially since you get the error in IDOC_INBOUND_ASYNCHRONOUS. An enque/table_size of 64,000 might not be enough for 50,000 IDocs: just think what happens if each IDoc requires two locks.
Hopefully, you should be able to solve the issue by setting the Queue Processing checkbox in your receiver IDoc adapter in PI. This forces the IDocs to be processed one by one, so that fewer locks are created simultaneously. The only thing I cannot foresee is how big the overall increase in processing time will be.
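To see why the lock table overflows, the arithmetic behind the paragraph above can be checked directly; the two-locks-per-IDoc figure is an assumption for illustration, not a measured value:

```python
# Rough peak lock demand if all IDocs are posted in parallel.
idocs = 50_000
locks_per_idoc = 2          # assumed; actual demand depends on the IDoc type
enque_table_size = 64_000   # current value of enque/table_size

peak_locks = idocs * locks_per_idoc
print(peak_locks, peak_locks > enque_table_size)  # 100000 True -> overflow
```

With Queue Processing enabled, only a few IDocs hold locks at any moment, so the peak stays far below the table size.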
But you will not know until you try, and please let us know about the results, as there may be others who follow your path.
Hope this helps,
Greg
Similar Messages
-
Data packet not getting processed
Hi SDN's
I am loading data from one ODS to 4 regions. The source ODS is loaded successfully, but from there to the data targets the load fails or takes a long time.
Up to the transfer rules the data is successful; in the update rules the data packets are not getting processed.
Kindly suggest a solution; points will be assigned.
Thanks in advance
Hi Katam,
In the target ODSs, go to the monitor screen for a particular request -> in the menu bar go to Environment -> Transact. RFC -> In the Data Warehouse -> give the ID and date -> execute.
Check if there are entries there. Usually this queue is stuck and you need to execute the LUWs in the queue manually.
If it says "transaction recorded", you need to execute it manually.
Please revert if there are any issues.
Edited by: Pramod Manjunath on Dec 19, 2007 4:48 PM -
Error while extracting huge volumes of data from BW
Hi,
We see this error while extracting huge volumes of data (approx. 3.4 million records, with a large number of columns):
R3C-151001: |Dataflow DF_SAPSI_SAPSI3131_SAPBW_To_Teradata
Error calling R/3 to get table data: <RFC Error:
Key: TSV_TNEW_PAGE_ALLOC_FAILED
Status: EXCEPTION SYSTEM_FAILURE RAISED
No more storage space available for extending an internal table.
>.
We are not sure whether DoP works with SAP BW as the source, but when we tried with DoP we got the same error.
Will this issue be resolved with an R/3 or ABAP dataflow? Can anyone suggest some possible solutions for this scenario?
Sri
The problem is that you've reached the maximum memory configured for your system.
If this is a batch job, reconfigure the profile parameter
abap/heap_area_nondia
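Heap parameters of this kind live in the instance profile (viewable via RZ11). The values below are purely illustrative and must be sized against the host's physical memory, not copied as-is:

```
abap/heap_area_nondia = 4000000000   # max heap per background work process, in bytes (illustrative)
abap/heap_area_total  = 8000000000   # max heap across all work processes, in bytes (illustrative)
```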
Markus -
In BDC I have a huge volume of data to upload for the given transaction
Hi gurus,
In BDC I have a huge volume of data to upload for the given transaction. I am using the session method, and it takes a lot of execution time to complete the whole transaction. Is there any other method to process this huge volume in the minimum time?
reward awaiting
with regards
Thambe
Selection of the BDC method depends on the type of requirement you have. But you can decide which one suits the requirement based on the differences between the two methods. The following are the differences between Session and Call Transaction.
Session method.
1) synchronous processing.
2) can transfer large amounts of data.
3) processing is slower.
4) error log is created
5) data is not updated until session is processed.
Call transaction.
1) asynchronous processing
2) can transfer small amount of data
3) processing is faster.
4) errors need to be handled explicitly
5) data is updated automatically
Batch Data Communication (BDC) is the oldest batch interfacing technique that SAP has provided since the early versions of R/3. BDC is not a typical integration tool in the sense that it can only be used for uploading data into R/3, and so it is not bi-directional.
BDC works on the principle of simulating user input for transactional screen, via an ABAP program.
Typically the input comes in the form of a flat file. The ABAP program reads this file and formats the input data screen by screen into an internal table (BDCDATA). The transaction is then started using this internal table as the input and executed in the background.
In Call Transaction, the transactions are triggered at the time of processing itself and so the ABAP program must do the error handling. It can also be used for real-time interfaces and custom error handling & logging features. Whereas in
Batch Input Sessions, the ABAP program creates a session with all the transactional data, and this session can be viewed, scheduled and processed (using Transaction SM35) at a later time. The latter technique has a built-in error processing mechanism too.
Batch Input (BI) programs still use the classical BDC approach but don't require an ABAP program to be written to format the BDCDATA. The user has to format the data using predefined structures and store it in a flat file. The BI program then reads this and invokes the transaction mentioned in the header record of the file.
Direct Input (DI) programs work in much the same way as BI programs. The only difference is that instead of processing screens, they validate fields and load the data directly into tables using standard function modules. For this reason, DI programs are much faster than their BDC counterparts (RMDATIND, the material master DI program, works at least 5 times faster) and so are ideally suited for loading large volumes of data. DI programs are not available for all application areas.
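One further way to cut the runtime of a large session-method upload, whichever BDC technique is chosen, is to split the input file into several smaller batch-input sessions and let SM35 process them in parallel. A minimal, language-agnostic sketch of the splitting step (chunk size and record format are illustrative):

```python
def split_into_sessions(records, chunk_size):
    """Group flat-file records into fixed-size chunks,
    one chunk per batch-input session."""
    return [records[i:i + chunk_size]
            for i in range(0, len(records), chunk_size)]

# 10,000 input records split into sessions of 2,500 records each.
records = [f"MATNR-{n:06d}" for n in range(10_000)]
sessions = split_into_sessions(records, 2_500)
print(len(sessions))  # 4 sessions that SM35 can run in parallel
```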
Synchronous & asynchronous updating:
http://www.icesoft.com/developer_guides/icefaces/htmlguide/devguide/keyConcepts4.html
Synchronous & asynchronous processing:
Asynchronous refers to processes that do not depend on each other's outcome, and can therefore occur on different threads simultaneously. The opposite is synchronous. Synchronous processes wait for one to complete before the next begins. For those Group Policy settings for which both types of processes are available as options, you choose between the faster asynchronous or the safer, more predictable synchronous processing.
By default, the processing of Group Policy is synchronous: computer policy is completed before the CTRL+ALT+DEL dialog box is presented, and user policy is completed before the shell is active and available for the user to interact with.
Note
You can change this default behavior by using a policy setting for each so that processing is asynchronous. This is not recommended unless there are compelling performance reasons. To provide the most reliable operation, leave the processing as synchronous. -
IDOCS Error - Not getting processed automatically
Hi All,
We are loading a hierarchy for a product from the R/3 system using the standard DataSource.
When we execute the InfoPackage, the IDocs are not getting processed automatically.
We are facing the error message below.
Error when updating Idocs in Business Information Warehouse
Diagnosis
Errors have been reported in Business Information Warehouse during IDoc update:
No status record was passed to ALE by the applicat
System Response
Some IDocs have error status.
Procedure
Check the IDocs in the Business Information Warehouse. You do this using the extraction monitor.
Error handling:
How you resolve the errors depends on the error message you get.
But when we go to BD87 and process the IDocs manually, they get posted and the hierarchy loads.
Can someone please guide me on what the issue with the IDocs is and how to make them post automatically?
Thanks in Advance
Regards,
Sachin
Hi,
This happens due to non-updated IDocs in the source system, i.e., it occurs whenever LUWs are not transferred from the source system to the destination system. If you look at the status tab in RSMO, the error message will read something like "tRFC Error in Source System", "tRFC Error in Data Warehouse", or simply "tRFC Error", depending on the system from which data is being extracted. Sometimes IDocs are also stuck on the R/3 side because no processors were available to process them. The solution is to execute the LUWs manually: go to the menu Environment -> Transact. RFC -> In the Source System from RSMO, which will ask you to log in to the source system. The "Status Text" for stuck LUWs may be "Transaction recorded" or "Transaction waiting". Once you encounter this type of status, execute the LUWs manually using F6 or Edit -> Execute LUWs (F6). Just keep refreshing until you get the status "Transaction is executing" in the production system. You can also see the stuck IDocs in transaction BD87. The data will then be pulled into BW.
Hope it helps a lot.
Thanks and Regards,
Kamesham -
IDOC status 64 not getting processed.
Hi Gurus,
I have a problem where IDocs with status 64 are not getting processed by the background job. The IDocs were being processed by the background job until this evening; then suddenly, although no changes were made, the processing of IDocs with status 64 stopped and they keep piling up. When I checked the log of the background job, it says "finished", and the log has the following information:
17.09.2009 19:05:23 Job started 00 516 S
17.09.2009 19:05:23 Step 001 started (program RBDAPP01, variant POS_IDOCS, user ID SAPAUDIT) 00 550 S
17.09.2009 19:05:25 No data could be selected. B1 083 I
17.09.2009 19:05:25 Job finished 00 517 S
Kindly advise on this.
Regards,
Riaz.
When I process the IDocs using BD87, they get processed without any issues; no dump is displayed, and each IDoc gets status 53 after processing via BD87.
There is another problem, however: other jobs scheduled to run in the background besides this one are getting cancelled, with two different error logs for different jobs, as follows:
1) 18.09.2009 02:00:06 Job started 00 516 S
18.09.2009 02:00:06 Logon not possible (error in license check) 00 179 E
18.09.2009 02:00:06 Job cancelled after system exception ERROR_MESSAGE 00 564 A
2) 18.09.2009 00:55:27 Job started 00 516 S
18.09.2009 00:55:27 Step 001 started (program RSCOLL00, variant , user ID JGERARDO) 00 550 S
18.09.2009 00:55:29 Clean_Plan:Gestartet von RSDBPREV DB6PM 000 S
18.09.2009 00:55:29 Clean_Plan:beendet DB6PM 000 S
18.09.2009 01:00:52 ABAP/4 processor: DBIF_RSQL_SQL_ERROR 00 671 A
18.09.2009 01:00:52 Job cancelled 00 518 A
This is a production server and the licence is valid until 31.12.9999, so there are no issues with the license. -
Huge volume of data with DTP
Hi Experts,
I have a problem with the upload of a huge volume of data with DTPs.
My initialisation is done, as I am doing reloads. Now I have data from fiscal year/period 000.2010 to 016.9999.
It is a huge volume of data.
I have tried uploading this data in chunks by dividing it into 3 months per DTP and running full loads.
But when I processed the DTP, the data packages were decided at the source and I got about 2,000 data packages.
Now my request turns red after processing about 1,000 data packages, and the batch processes allocated to it also stop.
I tried dividing the DTP by month only and processing it, with the same problem. I deleted the indexes before uploading to the cube and changed the batch-processing setting from 3 to 5.
Please can anyone advise what the problem could be? I am uploading these reloads in the quality system.
How can I upload this data, which runs to millions of records?
Thanks,
Tati
Hi Galban,
I have increased the parallel processing from 3 to 5, and also looked at the data package size.
Can you please advise how I can increase the data package size? For my upload, the package size corresponds to the package size in the source and is determined dynamically at runtime.
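For context, the package count falls linearly as the package size grows, so even a modest increase shrinks a 2,000-package load considerably. The figures below are illustrative only; the thread says the volume runs to millions:

```python
import math

total_records = 10_000_000   # assumed total volume
package_size = 5_000         # assumed current records per data package

packages = math.ceil(total_records / package_size)
packages_4x = math.ceil(total_records / (package_size * 4))
print(packages, packages_4x)  # 2000 500
```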
Please advise.
Thanks
Tati -
In VA01 In Schedule line Delivery date not getting populated
Hello,
We have developed enhancement for VA01 tcode.
Depending upon the quantity entered for a material, it should show one more item as a free good.
If the quantity is 10 for line item 10, then it should show item 20 by default, with the same material and quantity 1. We have done this and it is working fine.
But if we select the free-good item (in our case item 20) and click on the schedule line button, all the quantity fields are populated, but the delivery date is not populated for the free good only; for line item 10 it shows properly.
Can anyone please suggest what I need to do in order to display the delivery date for the free good?
Thanks in advance.
Regards.
Thanks for your response.
Actually, I have checked the technical settings for that field: it is RV45A-ETDAT. This field has a value up to include MV45AFZZ (seen in the debugger).
It looks like it is being cleared somewhere after that include. Any suggestions are welcome. Thanks again.
Data not getting populated in Payslip in ESS Portal
Hi All
I am trying to display the payslip in the Portal. I have done all the necessary configuration under Benefits and Payments -> Salary Statement -> HRFOR/EDTIN features.
The correct payslip form is visible, but the data is not populated in the payslip.
I have tested the payslip in transaction PC00_M40_CEDT with the variant I set for the HRFOR/EDTIN features, and the payslip data is displayed correctly.
I have checked transaction PZ11_PDF, but I get a message saying it cannot be accessed through Easy Access.
Can anyone please let me know what might be the reason for the data not being populated in the payslip in the Portal?
What is the role of transaction PZ11_PDF in the payslip display in the Portal?
Regards
Asha
Hello,
To execute the PZ11_PDF transaction, please follow these steps:
1. Log in to the SAP system with the same user ID you are using on the Portal.
Once logged in, put "/N" in the command box, then enter transaction PZ11_PDF and execute it; it will call the salary statement.
Or:
Once you log in to the SAP system, enter "/nsbwp", then enter transaction PZ11_PDF; it will call the salary statement.
Give your inputs once you are done.
If the issue is with authorisations, please check them:
Add the object S_SERVICE to the ESS role,
and the object P_PERNR (infotype 0008) to the ESS role.
Edited by: Vivek D Jadhav on Jun 15, 2009 11:49 AM -
Data not getting populated in ESS Payslip in portal
Hi All
I am trying to display the payslip in the Portal. I have done all the necessary configuration under Benefits and Payments -> Salary Statement -> HRFOR/EDTIN features.
The correct payslip form is visible, but the data is not populated in the payslip.
I have tested the payslip in transaction PC00_M40_CEDT with the variant I set for the HRFOR/EDTIN features, and the payslip data is displayed correctly.
I have checked transaction PZ11_PDF, but I get a message saying it cannot be accessed through Easy Access.
Can anyone please let me know what might be the reason for the data not being populated in the payslip in the Portal?
What is the role of transaction PZ11_PDF in the payslip display in the Portal?
Regards
Asha
Asha,
Maintain feature EDPDF, which determines the SMARTFORM used to make the payslip available to employees. This is more of an HR-related issue, and I believe that if you post it in the ESS or HR forum you will be able to resolve it.
Good Luck!
Sandeep Tudumu -
Data not getting fetched from Quotation to Contract
Hi, I am new to TM. I have a ticket mentioning that data is not getting fetched from the quotation to the contract. How do I solve it? Please help...
Message was edited by: Michael Appleby
Please add the version of the TM product and which SPs have been installed, as well as more information on the quotation and contract.
What do you mean by "ticket"? If you mean an error message, from where are you reading the message?
Regards, Mike
SAP Customer Experience Group - CEG -
Contact person Rel.ship Data not getting updated in B2B Web User Mngt
Hi CRM Gurus,
Need some help on Web User Management functionality.
Sub: Contact person relationship data not getting updated when we change the company (to which the contact person belongs) in ISA CRM 5.0 Web User Management.
We are currently on CRM ISA 5.0 and use Web User Management for our B2B scenario. Creation of new users works fine. But when we want to change the company (sold-to party) for an existing contact person, the relationship data in CRM is not getting updated. The details are below.
Contact person no. XXXX (has the relationship "is contact person for" company YYYY in CRM).
Company/sold-to party YYYY (has the relationship "has contact person" XXXX in CRM).
When I change the contact person's (XXXX) company from YYYY to ZZZZ:
- the relationships for the new assignment to ZZZZ are not updated in CRM;
- the old records for YYYY (i.e. the relationships) are not deleted;
- no relationship data appears on XXXX.
Appreciate any inputs on the same.
Thanks,
Rahul
Hi Rahul,
I'd suggest running a session trace / ABAP debugging to see if some information is not getting passed from the Java stack to the ABAP stack. An alternative would be to create a new OSS customer message.
Cheers,
Ashok. -
Adhoc Query data not getting displayed on Portal
Hi,
I have a problem with custom ad hoc query data not getting displayed on the Portal.
It was displayed initially, but after a user made some changes to the query it is no longer displayed.
The query displays data perfectly in R/3, but on the Portal it gives the message "no data found".
Can anyone help me with this?
Also, can anyone tell me how to debug an ad hoc query from the Portal?
Is there a tool to debug an ABAP program from the Portal?
I don't want to use a trace.
Thanks
GT
Message was edited by: GT
Hi GT,
Find out the EXACT query you want to launch. If it is displayed
in the BW Business Explorer, then change the iView property
for that query in the Portal: right-click on the iView,
BEx Web Application Query String -> assign the correct query.
regards,
kaushal -
Article IDoc not getting processed in PI for particular article type
Hello All,
I have strange behaviour in my system:
1. We create an article IDoc using BD10 for a particular article type.
2. The IDoc is created successfully.
3. We process the IDoc in BD87 to push it to the PI system.
4. But I cannot see the message in PI, and this happens for only one article type.
I am wondering why this is not getting processed for one article type only and not for the others.
We also checked the basic filter for the message type in the distribution model, and I could not see any condition in PI either.
Is there anything else I should check? Can you please help us with this?
Regards, Sethu.
Hi,
Try reimporting the metadata of the IDoc into PI in IDX2. Also check the port definition in ECC and whether the segment is released or not. Please go through the discussions below; they may help you.
EDISDEF: Port XXX segment defn E2EDK35 in IDoc type ORDERS05 CIM type
IDoc 0000000000181226 was saved but cannot or should not be sent
Regards,
Priyanka -
Automatic update of number ranges in FBN1 is not getting processed
The automatic update of number ranges in FBN1 (current status level) is not getting processed.
I tried posting a customer payment document in the system. Although number ranges are maintained for the year 2015, the ranges do not overlap, and all of last year's documents are closed, the numbers are still not picked up automatically in FBN1. Kindly help with this issue.
Hello Rajesh
Please verify that the number ranges are maintained correctly; check them using transaction OBA7. Also ensure that external assignment is unchecked in FBN1. Share screenshots if possible.
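Conceptually, the check behind FBN1/OBA7 is a simple interval rule: the next internal number must fall within the maintained range, and externally assigned ranges are never advanced by the system. The function below is a hypothetical illustration of that logic, not SAP code:

```python
def next_internal_number(current, range_from, range_to, external):
    """Hypothetical sketch of advancing a number range's current status."""
    if external:
        # Externally assigned ranges are supplied by the user, never advanced.
        raise ValueError("external range: no automatic assignment")
    nxt = current + 1
    if not (range_from <= nxt <= range_to):
        raise ValueError("number range exhausted")
    return nxt

print(next_internal_number(1400000099, 1400000000, 1499999999, external=False))  # 1400000100
```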
Regards
Aniket