AR Historical Data Upload
Gurus,
A real quick question:
A new implementation on ECC 6.0; we are struggling with the historical data upload for AR. How should it be done? For me to upload history and link it to the customer, I need to post it through the customer, and hence through the GL reconciliation account, and that GL account would need to be configured as a non-reconciliation account. But when I create the customer and try to link it to a non-recon account, the system does not let me save the master data.
Please advise on the AR historical data upload urgently. Thank you.
Achhar.
Why would you need a non-reconciliation account assigned in the customer master?
The normal posting would be, depending on your items:
DR Customer *
CR Load Account (whatever the name is)
The customer would update the assigned recon account.
Regards
Hein
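Hein's two-line posting can be sketched as a minimal double-entry illustration (plain Python, not SAP code; the customer number, reconciliation account 140000 and load account 199999 are made-up assumptions):

```python
# Minimal double-entry sketch of the AR takeover posting described above:
# DR customer (which posts to its assigned reconciliation account),
# CR a load/migration account. All names and numbers are illustrative.

customer_master = {"C1000": {"recon_account": "140000"}}  # recon a/c on master

def takeover_posting(customer, amount, load_account="199999"):
    """Two-line document: DR customer open item, CR load account."""
    recon = customer_master[customer]["recon_account"]
    return [
        {"account": recon, "customer": customer, "dr": amount, "cr": 0.0},
        {"account": load_account, "customer": None, "dr": 0.0, "cr": amount},
    ]

doc = takeover_posting("C1000", 500.0)
balance = sum(line["dr"] - line["cr"] for line in doc)  # must net to zero
```

The point of the sketch: the customer line carries the recon account from the master record automatically, so no non-recon account assignment is needed.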
Similar Messages
-
Upload Sales Historical Data in R/3?
Hi Guys:
We are planning to use the forecast-based planning MRP procedure (MRP type VV) for material planning, and we need historical sales data to execute the forecast.
Can anybody tell me step by step how to upload 12 months of historical data into SAP R/3, so that we can execute forecast-based planning for a material?
Thanks
Sweth
Edited by: Csaba Szommer on May 12, 2011 9:57 AM -
I need to pull data into BW from R/3 for freight rates.
For example, the rate goes from source to destination (in rows), and on that basis we have condition types (in columns); based on those condition types we get the cumulated value of the master data rate.
Like:
source | destination | rate (based on condition types)
Now in R/3 some condition types were changed to new ones, or you could say merged into new ones. For the new ones I can pull the data with no problem, but what do I do for the historical master data, where the condition types in 2006 differ from those in 2007
but are mapped to each other?
Thanks
rubane -
Upload PR/PO Historical data
Hi
How do we upload SAP PR/PO historical data from one SAP system to another?
I have all the relevant data table downloads from the old system. Now I need to upload this PR/PO historical data into the new SAP system.
There is no link between the old SAP system and the new SAP system; both are owned by different organizations/owners.
Immediate replies will be appreciated.
Thanks & Regards
Raju
Hi Raju,
Loading POs with transactions is problematic, since the follow-on documents can't be loaded. If a PO is completely closed and you load it into the new system, you should block it - otherwise it will still be taken into account in SAP.
If you load a PO with GR > IR, the GR's can't be loaded, which means the invoice posted against this PO will remain unbalanced. You can of course close the PO by using MR11, but I think it is much easier just to skip this PO and register the coming invoice directly in FI.
If you load a PO with IR > GR, the IR's can be loaded, which means that the coming GR will make the PO remain unbalanced. So here I would suggest skipping this order too and registering the coming GR without reference to the PO (mvt 501).
In all my projects we only migrated completely "naked" POs and, besides, tried to minimise the number of such orders - let the buyers purchase more and complete the orders before the cut-over date; they can do it. In that case the number of open orders will normally be so low that it won't justify automatic migration. On the other hand, entering the orders in the new system manually will be good training for the buyers.
If the reason for loading the closed POs is having purchasing statistics in the new system, then this can be done very easily by updating the corresponding LIS tables directly (S011, S012). Another tip is to load the data both to version 000 (actual data) and to some other version in these tables, so that you will always be able to copy them to 000 again whenever you need to reload LIS for some reason.
BR
Raf
Edited by: Rafael Zaragatzky on Jun 3, 2011 8:28 PM -
Upload of historical data from legacy system
Dear forum
We are running standard SOP with transaction MC88 in ECC6.0.
We are doing forecasting on material/plant level, based on historical data (we create the sales plan from transaction MC88).
Now, the problem is that we are introducing new products into SOP, but there is no historical data in SAP for those materials.
Is there any way we can import historical data into SAP from legacy system, so that SOP would take this data into account when calculating the forecast?
Note: we do not want to build anything in flexible planning. Just want to check whether or not it is possible to import historical values from a legacy system, to use as historical data in SAP.
Thanks in advance
Lars
Thanks,
But that will not help us.
Any other suggestions out there? -
How to fill a new single field in an InfoCube with historical data
Hello Everybody,
We have an SD infocube with historical data since 1997.
Some of the infoobjects (fields) of the infocube were empty during all this time.
Now we require to fill a single field of the infocube with historical data from R/3.
We were thinking that an option could be to upload data from the PSA in order to fill only the required field (infoobject).
Is it possible? Is there any problem with uploading from the PSA requests directly to the InfoCube?
Some people on our team think that the data may be duplicated - are they right?
Which other solutions can we adopt to solve this issue?
We will appreciate all your valuable help.
Thanks in advance.
Regards.
Julio Cordero.
Remodeling in BI 7:
/people/mallikarjuna.reddy7/blog/2007/02/06/remodeling-in-nw-bi-2004s
http://www.bridgeport.edu/sed/projects/cs597/Fall_2003/vijaykse/step_by_step.htm
Hope it helps.. -
Reloading Historical Data into Cube: 0IC_C03...
Dear All,
I'm working on SAP BW 7.3. I have five years of historical data loaded into InfoCube 0IC_C03 (Material Stocks/Movements (as of 3.0B)), and the deltas are also scheduled (DataSources: 2LIS_03_BF & 2LIS_03_UM) for the InfoCube. I have a new business requirement to reload the entire historical data into the InfoCube with the addition of "Fiscal Quarter", and I do not have a DSO in the data flow. I know it's a tricky InfoCube (Inventory Management) to work on, and reloading the entire historical data can be challenging.
1. How to approach this task, what steps should I take?
2. Drop/delete the entire data from the InfoCube, then delete the setup tables, refill them, and then run a "Repair Full Load" - is this the way?
3. What key points should I keep in mind, as different BO reports are already running and using this InfoCube?
4. Should I run "Repair Full Load" requests for all DataSources (2LIS_03_BX, 2LIS_03_BF & 2LIS_03_UM)?
I will appreciate any input.
Many thanks!
Tariq Ashraf
Hi Tariq,
Unfortunately, you will need some downtime to execute a stock initialization in a live production system; otherwise you cannot guarantee data integrity. There are, however, certainly ways to minimize it. Please see SAP Note 753654 - How can downtime be reduced for setup table update. You can dramatically reduce the downtime by distinguishing between closed and open periods. The closed periods can be updated retrospectively.
To make it more concrete, you could consider e.g. the periods prior to 2014 as "closed". You can then do the initialization starting from January 1, 2014. This will save you a lot of downtime. Closed periods can be uploaded retrospectively and do not require downtime.
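The closed/open split can be sketched as follows (plain Python, not SAP code; the cutoff and period dates are the illustrative values from the example above):

```python
# Sketch of the downtime-reduction idea: only periods from the cutoff onwards
# need the downtime initialization; earlier ("closed") periods can be loaded
# retrospectively without downtime. Dates are illustrative.
from datetime import date

cutoff = date(2014, 1, 1)
periods = [date(2012, 6, 1), date(2013, 11, 1), date(2014, 3, 1)]

closed = [p for p in periods if p < cutoff]        # load later, no downtime
open_periods = [p for p in periods if p >= cutoff]  # initialize during downtime
```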
Re. the marker update, please have a look at the document Re-initialization of the Material Stocks/Movements cube (0IC_C03) with 2LIS_03_BF, 2LIS_03_BX and 2LIS_03_UM in BW 7.x; steps 10, 11, 12 and 13 contain important information.
Re. the steps, it looks OK for me but please double check with the document Re-initialization of the Material Stocks/Movements cube (0IC_C03) with 2LIS_03_BF, 2LIS_03_BX and 2LIS_03_UM in BW 7.x.
Please have a look at the following document for more background information around inventory management scenarios:
How to Handle Inventory Management Scenarios in BW (NW2004)
Last but not least, you might want to have a look at the following SAP Notes:
SAP Note 436393 - Performance improvement for filling the setup tables;
SAP Note 602260 - Procedure for reconstructing data for BW.
Best regards,
Sander -
Historic data migration (forms 6i to forms 11g)
Hello,
We have done a migration from Forms 6i to Forms 11g. We are facing a problem with the historic data for a download/upload file utility. In Forms 6i the upload/download was done using the OLE Container, which has become obsolete; the new technology is WebUtil.
We converted the historic data from LONG RAW to BLOB (by export/import and by TO_LOB), and when opening the documents we either get a message or they cannot be opened. This issue exists for all document types, such as .doc, .docx, .html and .pdf. We are unable to open the documents after downloading them to local client machines.
One option which works is to manually download the documents (pdf, doc, etc.) from the older Forms 6i version (OLE) and upload them to Forms 11g (WebUtil). Is there any way this can be automated?
Thanks
Ram
Are you colleagues?
OLE Containers in Oracle Forms 6i -
Hi Experts, I am stuck with one query and hope you can help me. I need to extract historical data for employees from PA0000 and PA0001.
Basically I'm reading the input file from a flat file using GUI_UPLOAD and storing it in internal table t_txt_upload.
Then, for all the personnel numbers in this internal table, I need to extract the historical data.
When I try to fetch the data using a SELECT query, it returns the history, but after the READ statement only a single record is displayed.
Can you explain how to fetch historical records for a personnel number?
Please guide me. For clarification I am attaching the code - could you please check it and tell me the necessary changes required?
if not t_txt_upload[] is initial. "checking for the data.
sort t_txt_upload by pernr. "sorting uploaded internal table by pernr.
* Get the data from table PA0000.
select pernr
begda
endda
massn
from pa0000
into table it_p0000
for all entries in t_txt_upload
where pernr = t_txt_upload-pernr.
if sy-subrc ne 0.
message e063.
endif.
sort it_p0000 by pernr.
* Reading data from PA0001.
select pernr
begda
endda
persk
stell
plans
werks
btrtl
persg
from pa0001
into table it_p0001
for all entries in t_Txt_upload
where pernr = t_txt_upload-pernr.
if sy-subrc ne 0.
message e063.
endif.
sort it_p0001 by pernr.
endif.
Loop at t_txt_upload.
Read table it_p0000 with key pernr = t_txt_upload-pernr binary search.
Read table it_p0001 with key pernr = t_txt_upload-pernr binary search.
if sy-subrc eq 0.
it_final-pernr = t_txt_upload-pernr.
CONCATENATE IT_P0001-BEGDA+6(2) IT_P0001-BEGDA+4(2)
IT_P0001-BEGDA+0(4) INTO V_BEGIN_DATE
SEPARATED BY '/'.
if sy-subrc eq 0.
it_final-begda = v_begin_date.
endif.
CONCATENATE IT_P0001-ENDDA+6(2) IT_P0001-ENDDA+4(2)
IT_P0001-ENDDA+0(4) INTO V_END_DATE
SEPARATED BY '/'.
if sy-subrc eq 0.
it_final-endda = v_end_date.
endif.
append it_final.
clear it_p0000.
clear it_p0001.
Endif.
Endloop.
Hi user_jedi,
Yes, the data is persisted in the database and is permanent until deleted. The recommended practice is to use the Delete Data Alert Action to periodically purge historical data.
So an overall approach that you could use is:
* Incoming data is written both to BAM for the real time dashboards, and also to a system of record (database, datamart, data warehouse, ...).
* Use a Delete Data Alert Action to periodically purge historical data from BAM.
* If you want to access the historical data from BAM on occasion, you can use an External Data Object. Or use another reporting tool that's optimized for non-real time data.
Regards, Stephen -
Dear experts,
My client wants to bring at least one year of historical data from the legacy system into SAP to prevent creating duplicate invoices. For example, an invoice may already have been created and paid in legacy, and they would like to have visibility of that.
Also, for 1099 reporting purposes: the client is going live in the middle of the year, and they want to bring the data from legacy into SAP so that they can do their 1099s in one single place.
Can someone help with how other clients are doing this, and what is the best practice for bringing in the historical data?
Hi,
If your client wants one year of data to be entered in SAP, then you have to upload all the invoices and payments separately in SAP, with the offsetting GL being an initial upload account. After uploading, you have to manually clear the documents which were already cleared in the legacy system.
Example:
Invoice
Purchase A/c Dr
To Initial Upload A/c Cr
Initial Upload A/c Dr
To Vendor A/c Cr
Payment
Vendor A/c Dr
To Initial Upload A/c Cr
Initial Upload A/c Dr
To Bank A/c Cr
Now you have to clear all the documents manually.
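The posting scheme above can be checked with a small language-neutral sketch (plain Python, not SAP code; the amount and account names are illustrative). After both documents are posted, the initial upload account washes out, and the vendor nets to zero once the invoice and payment are cleared against each other:

```python
# Sketch of the upload scheme: every legacy invoice and payment is re-posted
# through an "Initial Upload" clearing account. Debits positive, credits
# negative. Illustrative only, not SAP code.
amount = 100.0
postings = [
    # Invoice: Purchase Dr / Initial Upload Cr, Initial Upload Dr / Vendor Cr
    ("Purchase", amount), ("InitialUpload", -amount),
    ("InitialUpload", amount), ("Vendor", -amount),
    # Payment: Vendor Dr / Initial Upload Cr, Initial Upload Dr / Bank Cr
    ("Vendor", amount), ("InitialUpload", -amount),
    ("InitialUpload", amount), ("Bank", -amount),
]

def balance(account):
    """Net debit (+) / credit (-) balance of one account."""
    return sum(v for a, v in postings if a == account)
```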
Hope this solves your issue.
Regards,
Shayam
Edited by: Shayam_210 on Aug 25, 2011 6:35 AM -
Hi experts,
Please help me to get the sequence for master data upload in SAP PM.
Probably yet another blog/document for someone to create...
Here's my offerings:
Finance data: chart of accounts, g/l accounts cost centres, profit centres, activity types, etc
Configuration data
Classes/characteristics
Inspection characteristics
Master Warranties/Permits/Partners/Documents
HR masters
Work centres
Materials
Functional locations
Equipment (serial numbers)
Measurement points/documents
BOMs
Task Lists
Maintenance items/plans
Maintenance plan schedule
Historical orders (if required)
Historical notifications (if required)
Historical Measurement documents (if required)
PeteA -
MP31 - no historical data - how to generate?
Hi specialists,
My problem: I have several materials for which I cannot execute a forecast, because there is "no historical data". The material is very old and there was certainly consumption on it in the past, but this material was never under MRP before.
Is there a way to generate this historical data automatically, maybe even in mass? Or do I really have to do that manually, material by material?
Any help is appreciated.
Tnx a lot
Kurt
Hi Kurt,
Historical consumption data gathering in the system does not depend on the material being under MRP, but on the definitions per movement type and on the maintenance of the period in one of the plant-level views of the material master.
I don't know of any procedure in standard SAP that will "rebuild" consumption data. Nevertheless, you can always enter your "corrected" historical data, and it will be used by the system for forecasting.
You can do it manually for each material, or upload it from an external file. -
Hi,
We need to migrate historical data, such as POs, for our client. The client wants to upload the data from 2008 till date. My question to you all is: what should the MM period be, i.e. should the MM period in OMSY be kept at 01/2008, or can I keep it as the current period?
Please suggest.
Dear Challenges,
I am assuming that go-live has been done.
For example: If the Posting Periods are of November and December, say go-live is done on 30th November.
Then you can upload PRs, POs or any other purchasing documents with any date, but stock needs to be uploaded only with a date on or before 30th November. Later, GRs can be posted on any date within the open posting period range.
For example: if go-live was done somewhere around August and the open posting periods are November and December, then, as in the case above, all the purchasing documents can be uploaded with any date, but GRs need to be posted only in the posting periods that are open. Reverting posting periods has very bad implications; kindly refer to the OSS notes on that.
If you have any queries, Kindly revert me with full details of your current situation.
Regards
Lakshminath Gunda -
Data upload for vendor balances using BDC
hi abap experts,
I have a requirement on data uploading using BDC.
For vendor balances, i.e. for transaction FBL1N (I was given a template for vendor balance upload and need to write a BDC program for it), I need to upload the existing transaction data into the system. Is recording necessary for this?
Can you please help me with a step-by-step process for vendor balance uploading?
Thanks,
Hema.
Hi
Please follow these steps.
Steps for recording:
Step 1: Go to transaction SHDB.
Step 2: Click on New Recording.
Step 3: Give the necessary details such as TCODE, description, etc.
Step 4: Do the screen-by-screen recording. (Please avoid letting extra screens appear.)
Step 5: Save the recording.
Step 6: Select the recording and click on the Program button on the toolbar.
Step 7: Give the program name and select the radio button Transfer from recording.
Step 8: It will open a new session with SE38 and a program with the recording.
Step 9: Then just add the basic code for the BDC.
Regards,
Lokesh -
Basic Data Upload to MATERIAL Using LSMW is not working
Hi All,
We are using LSMW program /SAPDMC/SAP_LSMW_IMPORT_TEXTS to upload the basic text of the material. All steps are executed correctly and it shows that the records were transferred correctly, but in MM03 the text is not uploaded.
EPROC_PILOT - MASTER - TEXT_UPLOAD Basic long text 1line
Field Mapping and Rule
/SAPDMC/LTXTH Long Texts: Header
Fields
OBJECT Texts: Application Object
Rule : Constant
Code: /SAPDMC/LTXTH-OBJECT = 'MATERIAL'.
NAME Name
Source: LONGTEXT-NAME (Name)
Rule : Transfer (MOVE)
Code: /SAPDMC/LTXTH-NAME = LONGTEXT-NAME.
ID Text ID
Source: LONGTEXT-ID (Text ID)
Rule : Transfer (MOVE)
Code: /SAPDMC/LTXTH-ID = LONGTEXT-ID.
SPRAS Language Key
Source: LONGTEXT-SPRAS (Language Key)
Rule : Transfer (MOVE)
Code: /SAPDMC/LTXTH-SPRAS = LONGTEXT-SPRAS.
* Caution: Source field is longer than target field
/SAPDMC/LTXTL Long Texts: Row
Fields
TEXTFORMAT Tag column
Rule : Constant
Code: /SAPDMC/LTXTL-TEXTFORMAT = 'L'.
TEXTLINE Text Line
Source: LONGTEXT-TEXTLINE (Text Line)
Rule : Transfer (MOVE)
Code: /SAPDMC/LTXTL-TEXTLINE = LONGTEXT-TEXTLINE.
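The two mapping rules used above ("Constant" and "Transfer (MOVE)") can be illustrated with a small sketch (plain Python, not LSMW itself; the field names mirror the mapping shown, while the record content is made up):

```python
# Generic sketch of the two LSMW field-mapping rules used in this object:
# "constant" writes a fixed value, "move" copies the source field unchanged.
mapping = {
    "OBJECT": ("constant", "MATERIAL"),  # /SAPDMC/LTXTH-OBJECT = 'MATERIAL'.
    "NAME":   ("move", "NAME"),          # ... = LONGTEXT-NAME.
    "ID":     ("move", "ID"),            # ... = LONGTEXT-ID.
    "SPRAS":  ("move", "SPRAS"),         # ... = LONGTEXT-SPRAS.
}

def convert(source):
    """Apply the mapping rules to one source record."""
    return {
        field: value if rule == "constant" else source[value]
        for field, (rule, value) in mapping.items()
    }

rec = convert({"NAME": "MAT-001", "ID": "GRUN", "SPRAS": "E"})
```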
And at the end it displays as follows:
LSM Workbench: Convert Data For EPROC_PILOT, MASTER, TEXT_UPLOAD
2010/02/01 - 10:14:25
File Read: EPROC_PILOT_MASTER_TEXT_UPLOAD.lsmw.read
File Written: EPROC_PILOT_MASTER_TEXT_UPLOAD.lsmw.conv
Transactions Read: 1
Records Read: 1
Transactions Written: 1
Records Written: 2
Can anyone tell us what the problem could be?
Regards
Channappa Sajjanar
Hi, thanks for your reply,
I ran all the steps.
When I run the program it gives a message as follows:
Legacy System Migration Workbench
Project: EPROC_PILOT eProcurement Pilot
Subproject: MASTER Master data Upload / Change
Object: TEXT_UPLOAD Basic long text 1line
File : EPROC_PILOT_MASTER_TEXT_UPLOAD.lsmw.conv
Long Texts in Total: 1
Successfully Transferred Long Texts: 1
Non-Transferred Long Texts: 0 -
How to schedule Job for data uploading from source to BI
Hi to all,
How do we schedule a job for data uploading from the source system to BI?
Why is it required, and how do we do it?
As I am a fresher in BI, I need to learn it from the bottom up.
Regards
Pavneet Rana
Hi.
You can create a [process chain|http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/502b2998-1017-2d10-1c8a-a57a35d52bc8?quicklink=index&overridelayout=true] for the data loading process and schedule the start process to any time/date, etc.
Regards.