How to restrict the number of data packets from the source system into BW
Hi,
When I try loading data from the IDES system to the PSA in BI 7.0, I get the following error from data packet 41 onwards:
Data Package 41 : arrived in BW ; Processing : Selected number does not agree with transferred n
I have already read the various suggestions on the forum. Since I am loading data for testing purposes, I would like to restrict the data load to the first 40 data packets. Is this possible, and how?
Please help. Thanks
SD
Hi,
I don't think there's a parameter you can set for the number of data packages.
There's a parameter for the size of the data package, but that's different.
Of course, you can always restrict the data you load with the selection in your InfoPackage.
That's a more sensible approach than restricting the number of data packages: even if you somehow managed to cap them at 40, you wouldn't know which data the packages from 41 onwards would have brought, and that will lead to reconciliation issues with the R/3 data in your testing.
-RMP
Similar Messages
-
How to restrict number of Data Records from Source system?
Hi,
How can I restrict the number of data records that are loaded into BI from the R/3 source system? For example, I have 1000 source data records but only wish to transfer the first 100. How can I achieve this? Is there some option in the DataSource definition or the InfoPackage definition?
Please help,
SD

Hi SD,
You can certainly restrict the number of records. The best and simplest way: check which characteristics are present in the selection screen of the InfoPackage, then check in R/3 which of those characteristics, given a selection, would fetch the desired number of records. Use that as the selection in the InfoPackage.
Regards,
Pankaj -
How to schedule Job for data uploading from source to BI
Hi to all,
How do I schedule a job for data uploading from the source system to BI?
Why is it required, and how do we do it?
As I am a fresher in BI, I need to learn this from the basics.
Regards
Pavneet Rana

Hi,
You can create a [process chain|http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/502b2998-1017-2d10-1c8a-a57a35d52bc8?quicklink=index&overridelayout=true] for the data loading process and schedule the start process to any time/date.
Regards. -
In SharePoint 2013 if you search for an exact phrase and then view a PDF that is returned by the search (in Reader via the PDF plugin) then the Reader will treat the complete phrase as individual words.
For example, search for "High Court" in SharePoint, get returned a set of PDF documents, and when you view one of them through the Reader the hit highlighting will be for each instance of "High" or "Court".
What is wanted is hit highlighting only of instances of "High Court".
I get the same behavior if the Adobe PDF plugin is installed with the FileSite DMS, so it seems to be standard behavior for the Adobe PDF plugin.
Is there a way to make the Adobe PDF plugin carry the exact-phrase search criteria through from the source system into Reader, and then hit-highlight the exact phrase only?

Hi Sam,
You can opt for the 'Advanced Search' option in Reader and match whole words only to get to the specific documents.
Regards,
Rave -
Data loading from source system takes long time.
Hi,
I am loading data from R/3 to BW and getting the following message in the monitor.
Request still running
Diagnosis
No errors could be found. The current process has probably not finished yet.
System response
The ALE inbox of the SAP BW is identical to the ALE outbox of the source system
and/or
the maximum wait time for this request has not yet run out
and/or
the batch job in the source system has not yet ended.
Current status
in the source system
Is there anything wrong with the partner profile maintenance in the source system?
Cheers
Senthil

Hi,
I suggest checking a few places where you can see the status:
1) SM37 job log (in the source system if the load is from R/3, or in BW if it's a datamart load): enter the request name, and it should give you the details about the request. If it's active, make sure the job log is being updated at frequent intervals.
Also check whether any data packet shows 'sysfail' in SM37.
2) SM66: get the job details (server name, PID, etc. from SM37) and check whether the job is running (in the source system if the load is from R/3, or in BW if it's a datamart load). See whether it's accessing/updating some tables or not doing anything at all.
3) RSMO: see what is available in the Details tab. It may be stuck in the update rules.
4) ST22: check whether any short dump has occurred (in the source system if the load is from R/3, or in BW if it's a datamart load).
5) SM58 and BD87: check for pending tRFCs and IDocs.
Once you identify the problem, you can rectify the error.
If all the records are in the PSA, you can pull them from the PSA to the target. Otherwise you may have to pull them again from the source.
If it's running and you can see it active in SM66, you can wait for some time to let it finish. You can also try SM50/SM51 to see what is happening at the system level, like reading/inserting tables.
If you believe it's active and running, you can verify this by checking whether the number of records in the data tables has increased.
SM21 (system log) can also be helpful.
Thanks,
JituK -
Trigger Process Chain from Source System into Target System
Hello. In the source system BZD we have a process chain, and in the target system BWD we have another one. We want to combine these two process chains: when the chain in the source system completes successfully, it should send some sort of signal to start the chain in BWD. I am not sure how this can be done. I tried the function module "BP_EVENT_RAISE" in BZD, but calling it does not start the process chain in BWD after the chain in BZD completes. I wonder whether I am using the correct method. It would be great if anyone here could help. Thanks a lot.
Hi Fluffy,
Check the RFC connection between the two systems.
Try using the remote process chain process type in your process chain. The following link may help you:
http://help.sap.com/saphelp_nw70/helpdata/EN/86/6ff03b166c8d66e10000000a11402f/frameset.htm
Regards,
sunmit. -
How to Restrict the consumer data flow from CRM to R/3
Hi Experts
I have a scenario where my client doesn't want consumer data created in CRM's B2C shop to flow into R/3. He wants to map the consumer data to R/3 one-time customers so that further processes such as delivery take place, and the status gets updated in the B2C shop via CRM.
Hence I request you all: could you please help me solve this issue?
I would appreciate your help.
Thanking you in advance
Regards
Rao.

Hi Mark,
Thanks very much for your reply.
My scenario is that I have set up the B2C shop in CRM, not in ECC. When a consumer registers in the B2C shop, he creates an order, and the standard functionality is that the CRM order flows into R/3 along with the consumer data (sold-to, ship-to, bill-to, etc.) for further processing like delivery and billing.
Now my requirement: the client does not want consumer data to flow into R/3. He wants to map the consumer data created in CRM to an R/3 one-time customer for delivery, so that whenever a consumer is created in the B2C shop, it is mapped to the same R/3 one-time customer.
That means the one-time customer data should be replaced each time with the consumer data created in the B2C shop, and delivery should take place.
So could you please help me: is there any way to resolve this issue?
Thanking you
Regards
Rao -
Master Data Extraction from source system
Hi BW Gurus,
I need to extract the "Purchase Document Type" and "Purchase Document Status" master data from R/3.
Can anybody shed some light on which table in R/3 hosts this data?
Thank you in advance, and I promise to assign points for your answers.
Thank you

Hi,
I believe the base table for Purchasing Document Type is T161.
For the status of the purchasing document there is no base table; the available values come from the domain's fixed values.
With regards,
Anil Kumar Sharma .P -
Understanding the data transfer from source system to PSA
hi experts,
I am new to BI. I have created a transactional DataSource for the VBAK table and replicated it to BI, then created transformations and an InfoPackage for uploading the data into the PSA. I just want to know the sequence of steps the system executes when we run the InfoPackage. I know that it makes an RFC call to the source and brings the data back to BI; I want to know where that RFC call happens, what the RFC FM name is, and so on. I tried to debug the process using /h, but as you know, it is very difficult to debug the whole standard code. It is very complex, and I got lost somewhere in the middle. If anybody has any idea or has done any research on this, please share your findings.
Thanks

Hi,
Once you click the Start button in the InfoPackage:
1. The BW system sends a request to ECC along with the DataSource details.
2. Based on the BW request, ECC responds, sends an OK message, and starts the extraction.
3. Based on the InfoPackage selection, the extraction runs in ECC and picks up the data in packets.
4. ECC sends the data to BW in the form of data packets (IDocs).
5. Once the selection criteria have been processed, the job finishes.
6. You can see the data in BW.
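The packet-wise handover in steps 3 and 4 can be sketched generically. This is not SAP code; it is just an illustration (with hypothetical names) of splitting an extracted result set into fixed-size packets, which is essentially what the data package size setting controls on the SAP side:

```java
import java.util.ArrayList;
import java.util.List;

public class Packetizer {
    // Split a list of records into packets of at most packetSize entries,
    // mimicking how an extractor ships data to BW packet by packet.
    static <T> List<List<T>> toPackets(List<T> records, int packetSize) {
        List<List<T>> packets = new ArrayList<>();
        for (int i = 0; i < records.size(); i += packetSize) {
            packets.add(records.subList(i, Math.min(i + packetSize, records.size())));
        }
        return packets;
    }
}
```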
Thanks
Reddy -
Data Flow from Source System Side: LUWs and Extract Structures
Hi
Can anybody explain the data flow from the source system to the BI system? In particular, where do the extract structure and the LUWs come into the picture, and what is the core data flow of the inbound and outbound queues? A link to a document would also be helpful.
Regards
Santosh

Hi, see these articles:
http://wiki.sdn.sap.com/wiki/display/profile/Surendra+Reddy
Data Flow from LBWQ/SMQ1 to RSA7 in ECC (Records Comparison).
http://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/com.sap.km.cm.docs/library/enterprise-data-warehousing/data%20flow%20from%20lbwq%20smq1%20to%20rsa7%20in%20ecc%20(Records%20Comparison).pdf
Checking the Data using Extractor Checker (RSA3) in ECC Delta Repeat Delta etc...
http://www.sdn.sap.com/irj/scn/index?rid=/library/uuid/80f4c455-1dc2-2c10-f187-d264838f21b5&overridelayout=true
Data Flow from LBWQ/SMQ1 to RSA7 in ECC and Delta Extraction in BI
http://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/com.sap.km.cm.docs/library/business-intelligence/d-f/data%20flow%20from%20lbwq_smq1%20to%20rsa7%20in%20ecc%20and%20delta%20extraction%20in%20bi.pdf
Thanks
Reddy -
Compare dates coming from source system & update higher one in target system
hi all,
My requirement is to compare dates coming from the source system and update the higher one.
Example: E1EDP01 is repeated 3 times in the IDoc segment E1EDP20; the record with the higher date needs to be updated in the target system,
like 14/12/08
15/12/08
16/12/08
Here 16/12/08 needs to be updated first, and then the other ones one by one.
Can anybody guide me on the functionality for comparing these dates?
Send me the code!
Regards
Chaithanya

Hi Michael,
I was unable to work out exactly how to track the E1EDP01 dates and compare them.
I created a UDF to compare the dates coming from the E1EDP01 segment.
The following is my code:
// This UDF returns "1" for the highest date and "0" for all other dates.
DateFormat mydateformat = new SimpleDateFormat("ddMMyyyy");
Date mydate1 = null;
Date heightDt = null;
// First pass: find the highest date, skipping context-change markers.
for (int i = 0; i < a.length; i++) {
    if (a[i].equals(ResultList.CC)) continue;
    try {
        mydate1 = mydateformat.parse(a[i]);
        // Initialise on the first parsed value (not only at i == 0), so a
        // leading context change does not leave heightDt null.
        if (heightDt == null || heightDt.before(mydate1))
            heightDt = mydate1;
    } catch (Exception e) {}
}
// result.addValue(mydateformat.format(heightDt));
// Second pass: emit "1" for entries equal to the highest date, "0" otherwise.
for (int i = 0; i < a.length; i++) {
    try {
        mydate1 = mydateformat.parse(a[i]);
        if (heightDt.equals(mydate1))
            result.addValue("1");
        else
            result.addValue("0");
    } catch (Exception e) {}
}
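For reference, the same highest-date comparison can be written as a self-contained method outside the PI mapping runtime (where `a` and `result` are supplied by the framework). The class and method names below are illustrative only, not an actual UDF signature:

```java
import java.text.DateFormat;
import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.Date;

public class HighestDateFlag {
    // Flags each date string with "1" if it equals the highest date in
    // the array and "0" otherwise (same ddMMyyyy format as the UDF).
    static String[] flagHighest(String[] dates) {
        DateFormat fmt = new SimpleDateFormat("ddMMyyyy");
        try {
            Date highest = null;
            for (String s : dates) {
                Date d = fmt.parse(s);
                if (highest == null || d.after(highest)) {
                    highest = d;
                }
            }
            String[] flags = new String[dates.length];
            for (int i = 0; i < dates.length; i++) {
                flags[i] = fmt.parse(dates[i]).equals(highest) ? "1" : "0";
            }
            return flags;
        } catch (ParseException e) {
            throw new IllegalArgumentException("Unparseable date", e);
        }
    }
}
```

For the example dates in the question (14/12/08, 15/12/08, 16/12/08 as ddMMyyyy strings), only the last entry is flagged "1".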
My problem: if there are multiple E1EDP20 segments in my IDoc, the date is repeated 4 times, and I have to find the highest of the 4 dates and send it to the target system.
That part I have done; but the segment E1EDP01 is repeated, and I am unable to find the highest date individually at the node level.
Sorry, I should have explained this before.
Can anyone guide me in comparing the dates at segment level?
Regards
Chaithanya -
DataSource extraction very slow (from source system to PSA it takes 23 hrs)
Friends,
We have enhanced the DataSource 0CRM_SALES_ORDER_I with a user exit. After the enhancement (adding the new fields and writing some enhancement code), the data extraction takes around 23 hours for approximately 250,000 records.
Can you please suggest any steps to tune the performance of the DataSource?
NOTE: the extraction from the source system to the PSA alone takes 23 hrs. Once the data has arrived in the PSA, loading the data to the cube is fast.
Please help me solve this issue.
BASKAR

Hi Friends,
This is the code used for the DataSource enhancement (EXIT_SAPLRSAP_001):
DATA : IS_CRMT_BW_SALES_ORDER_I LIKE CRMT_BW_SALES_ORDER_I.
DATA: MKT_ATTR TYPE STANDARD TABLE OF CRMT_BW_SALES_ORDER_I.
DATA: L_TABIX TYPE I.
DATA: LT_LINK TYPE STANDARD TABLE OF CRMD_LINK,
LS_LINK TYPE CRMD_LINK.
DATA: LT_PARTNER TYPE STANDARD TABLE OF CRMD_PARTNER,
LS_PARTNER TYPE CRMD_PARTNER.
DATA: LT_BUT000 TYPE STANDARD TABLE OF BUT000,
LS_BUT000 TYPE BUT000.
DATA: GUID TYPE CRMT_OBJECT_GUID.
DATA: GUID1 TYPE CRMT_OBJECT_GUID_TAB.
DATA: ET_PARTNER TYPE CRMT_PARTNER_EXTERNAL_WRKT,
ES_PARTNER TYPE CRMT_PARTNER_EXTERNAL_WRK.
TYPES: BEGIN OF M_BINARY,
OBJGUID_A_SEL TYPE CRMT_OBJECT_GUID,
END OF M_BINARY.
DATA: IT_BINARY TYPE STANDARD TABLE OF M_BINARY,
WA_BINARY TYPE M_BINARY.
TYPES : BEGIN OF M_COUPON,
OFRCODE TYPE CRM_MKTPL_OFRCODE,
END OF M_COUPON.
DATA: IT_COUPON TYPE STANDARD TABLE OF M_COUPON,
WA_COUPON TYPE M_COUPON.
DATA: CAMPAIGN_ID TYPE CGPL_EXTID.
TYPES : BEGIN OF M_ITEM,
GUID TYPE CRMT_OBJECT_GUID,
END OF M_ITEM.
DATA: IT_ITEM TYPE STANDARD TABLE OF M_ITEM,
WA_ITEM TYPE M_ITEM.
TYPES : BEGIN OF M_PRICE,
KSCHL TYPE PRCT_COND_TYPE,
KWERT TYPE PRCT_COND_VALUE,
KBETR TYPE PRCT_COND_RATE,
END OF M_PRICE.
DATA: IT_PRICE TYPE STANDARD TABLE OF M_PRICE,
WA_PRICE TYPE M_PRICE.
DATA: PRODUCT_GUID TYPE COMT_PRODUCT_GUID.
TYPES : BEGIN OF M_FRAGMENT,
PRODUCT_GUID TYPE COMT_PRODUCT_GUID,
FRAGMENT_GUID TYPE COMT_FRG_GUID,
FRAGMENT_TYPE TYPE COMT_FRGTYPE_GUID,
END OF M_FRAGMENT.
DATA: IT_FRAGMENT TYPE STANDARD TABLE OF M_FRAGMENT,
WA_FRAGMENT TYPE M_FRAGMENT.
TYPES : BEGIN OF M_UCORD,
PRODUCT_GUID TYPE COMT_PRODUCT_GUID,
FRAGMENT_TYPE TYPE COMT_FRGTYPE_GUID,
ZZ0010 TYPE Z1YEARPLAN,
ZZ0011 TYPE Z6YAERPLAN_1,
ZZ0012 TYPE Z11YEARPLAN,
ZZ0013 TYPE Z16YEARPLAN,
ZZ0014 TYPE Z21YEARPLAN,
END OF M_UCORD.
DATA: IT_UCORD TYPE STANDARD TABLE OF M_UCORD,
WA_UCORD TYPE M_UCORD.
DATA: IT_CATEGORY TYPE STANDARD TABLE OF COMM_PRPRDCATR,
WA_CATEGORY TYPE COMM_PRPRDCATR.
DATA: IT_CATEGORY_MASTER TYPE STANDARD TABLE OF ZPROD_CATEGORY ,
WA_CATEGORY_MASTER TYPE ZPROD_CATEGORY .
types : begin of st_final,
OBJGUID_B_SEL TYPE CRMT_OBJECT_GUID,
OFRCODE TYPE CRM_MKTPL_OFRCODE,
PRODJ_ID TYPE CGPL_GUID16,
OBJGUID_A_SEL type CRMT_OBJECT_GUID,
end of st_final.
data : t_final1 type standard table of st_final.
data : w_final1 type st_final.
SELECT b~OBJGUID_B_SEL a~OFRCODE a~PROJECT_GUID b~OBJGUID_A_SEL INTO TABLE t_final1 FROM
CRMD_MKTPL_COUP AS a INNER JOIN CRMD_BRELVONAE AS b ON b~OBJGUID_A_SEL = a~PROJECT_GUID. -
How to Restrict Single Delivery Date for PO with Multiple Line Items
Dear Experts,
How can we restrict a PO with multiple line items to a single delivery date?
The system should throw an error message if the user enters different delivery dates for a PO with multiple line items in transaction ME21N.
Can we achieve this through some enhancement in SAP or not?
If so, how do we do it?
Any input is highly appreciated.
Thanks and Regards,
Selvakumar. M

Hi Selvakumar,
We can restrict the PO to a single delivery date across all line items, either by issuing an error message or by overwriting the delivery date keyed/determined in the line item.
You can use the BAdI ME_PROCESS_PO_CUST, in which you need to implement the method PROCESS_SCHEDULE.
(For technical aid: this method is called for each and every PO line item. From the importing parameter im_schedule we can get all the details of the current PO line, and we can even change the data in the current PO line.)
Regards,
Madhu. -
How to skip an entire data packet while data loading
Hi All,
We want to skip some records based on a condition while loading from the PSA to the cube, for which we have written ABAP code in the start routine.
This is working fine.
But there is one data packet where all the records are supposed to be skipped, and there it produces a dump with the exception CX_RSROUT_SKIP_RECORD.
The ABAP code written is:
DELETE SOURCE_PACKAGE WHERE FIELD = 'ABC'.
For that particular data packet, all the records satisfy the condition and get deleted.
Please advise how to skip the entire data packet if all the records satisfy the deletion condition, and how to handle the exception CX_RSROUT_SKIP_RECORD.
Edited by: Rahir on Mar 26, 2009 3:26 PM
Edited by: Rahir on Mar 26, 2009 3:40 PM

Hi All,
The dump I am getting is:
The exception 'CX_RSROUT_SKIP_RECORD' was raised, but it was not caught anywhere along the call hierarchy.
Since exceptions represent error situations and this error was not adequately responded to, the running ABAP program 'GPD4PXLIP83MFQ273A2M8HU4ULN' has to be terminated.
But this comes only when all the records in a particular data packet get skipped.
For the rest of the data packets it works fine.
I think if the data packet (with 0 records) itself could be skipped, this would be resolved, or the exception would be taken care of.
Please advise how to resolve this and avoid 'CX_RSROUT_SKIP_RECORD' at the earliest.
Edited by: Rahir on Mar 27, 2009 6:25 AM
Edited by: Rahir on Mar 27, 2009 7:34 AM -
How to store the data coming from network analyser into a text or excel file
Hi everyone,
I'm using an Agilent 8719ET network analyser and wish to store the data coming from the network analyser in a text file or Excel file.
Presently I'm able to get the data on a LabVIEW graph using GPIB. Can anyone suggest how to go ahead after the Collect Data sub-VI? How can the data be stored in a file apart from being shown on the graph?
Attached is the vi for kind consideration...
Looking for help
Regards
Rohit
Attachments:
Agilent 87XX Series Exceed Max Meas.vi 43 KB

First let me say that your code really looks pretty good. The data handling could be made more efficient by calculating the number of data points that will be in the completed dataset and preallocating the entire array; but depending on your answers to my questions, the logic in the lower shift register may be going away, so we won't worry about that right now.
The thing I need to know before addressing the data storage question is: each time you call "Collect and Display Data.vi", how many elements are in the array? Are you reading single data points, or a group of data? (BTW: even if the answer to that question is obvious from the way the other VIs are set up, I don't have the drivers, so I can't tell what the setup values are.) Second, how fast does the loop iterate? Are we talking msec per loop? Seconds? Fortnights?
The issues here are two-fold: how much data, and how fast is it coming? The answers to these will tell you how to save the data.
Mike...
Certified Professional Instructor
Certified LabVIEW Architect
LabVIEW Champion
"... after all, He's not a tame lion..."
Be thinking ahead and mark your dance card for NI Week 2015 now: TS 6139 - Object Oriented First Steps