Data collection in SEM from ECC 5.0
Hi experts
I need to have the information online in my consolidation system. The following is my plan:
a) I want to use real-time data transfer from FI to EC-CS
b) and use the extractor 3EC_CS_1 from ECC to
In SEM-BCS, what happens with the calculation of retained earnings and the balance carryforward, since in ECC these activities have already been executed?
Thanks for your help.
GLT0 is not the right table for data extraction for BCS, since not all the characteristics are available in this table. Further, GLT0 may not be available at all, since this table may be deactivated once the New GL is activated, to avoid occupying space with redundant data.
If you are using the New GL functionality, you can extract the data from table FAGLFLEXT using the 0SEM_BCS10 extractor. If you are not using FAGLFLEXT, you can extract the data through your special purpose ledger table. If no special purpose ledger is used either, you can extract the data from the BSEG table, but this is the least preferred option due to data volume and performance reasons.
Please read the "How to Configure .... Consolidation" business scenario from the SAP Service Marketplace for a better understanding of the selection of tables.
Please also read OSS Note 935370, which will be very useful.
The following links will be very useful in understanding the calculation of retained earnings (RE) and balance carryforward (BCF) in SEM-BCS:
http://help.sap.com/saphelp_sem60ep1/helpdata/en/91/4b721354f68b45987210d89e9d30b3/content.htm
http://help.sap.com/saphelp_sem60ep1/helpdata/en/53/01ad3c99c90455e10000000a114084/content.htm
Hope this helps!
Similar Messages
-
By using middleware how data can be passed from ecc to crm?
Hello,
If the middleware setup is done for data exchange between ECC and CRM, you can use customizing objects (available in transaction R3AC3), business objects (transaction R3AC1), and condition/pricing objects (transaction R3AC5) to extract data from R/3 to CRM. You can also find some upload objects for uploading data from CRM to R/3.
You can use transaction R3AS to perform the initial load of the customizing/business objects.
Once initial load is done, any delta changes would come automatically to CRM by means of delta load.
You also have the request load, with which you can download a range of materials/BPs that are missing or need to be updated in CRM.
If you have any specific questions, let me know.
Hope this helps!
Best Regards,
Shanathala Kudva. -
Data Loading into Infocube from ECC.
Hi,
Can someone help me with the list of steps to be followed for loading data into an InfoCube using a data transformation process, i.e. the steps to be performed in the system?
Also, please share any links on the various methods of loading data into an InfoCube from ECC.
Thanks!
The steps would be as follows:
1) Create the info-objects.
2) Create the InfoCube. If you have a Planning Area and the cube is a replica of the PA, you can use /SAPAPO/TS_PAREA_TO_ICUBE to create the InfoCube.
3) Create a source system as a file system.
4) Create the DataSource with the relevant fields.
5) Create a transformation for the cube using the DataSource you created in step 4.
6) The last step is to create the DTP. When you create the DTP, on the first screen, check the checkbox "Do not load data from PSA". This will give you additional fields in which to enter the file path. When this DTP is created and generated, go to the last tab of the DTP to execute it.
This will help you to directly pull the data from the file and load it into the cube.
You can also create a Process chain to automate this process.
Hope this helps.
Let me know in case of any more information. -
A software application was developed to collect and process readings from capacitance sensors and a tachometer in a running spin rig. The sensors were connected to an Aerogate Model HP-04 H1 Band Preamp connected to an NI PXI-6115. The sensors were read using AI Config and AI Start VIs. The data was saved to a file using hsdlConfig and hsdlFileWriter VIs. In order to add the capability of collecting synchronized data from two Eddy Current Position sensors in addition to the existing sensors, which will be connected to a BNC-2144 connected to an NI PXI-4495, the AI and HSDL VIs were replaced with DAQmx VIs logging to TDMS. When running identical tests, the new file format (TDMS) produces reads that are higher and inconsistent with the readings from the older file format (HSDL).
The main VIs are SpinLab 2.4 and SpinLab 3.8, in the folders "SpinLab old format" and "Spinlab 3.8" respectively. SpinLab 3.8 requires the Sound and Vibration suite to run correctly, but that part is used after the code that is causing the problem. The problem is occurring during data collection in the Logger segment of code, or during processing in the Reader/Converter segment. I could send the readings from the identical tests if they would be helpful, but the data takes up approximately 500 MB.
Attachments:
SpinLab 3.8.zip 1509 KB
SpinLab 2.4.zip 3753 KB
SpinLab Screenshots.doc 795 KB
First of all, how different is the data? You say that the reads are higher and inconsistent. How much higher? Is every point inconsistent, or is it just parts of your file? If it's just in parts of the file, does there seem to be a consistent pattern as to when the data is different?
Secondly, here are a couple things to try:
Currently, you are not calling DAQmx Stop Task outside of the loop; you're just calling DAQmx Clear Task. This means that if any errors occurred in the logging thread, you might not be seeing them (as DAQmx Clear Task clears outstanding errors within the task). Add a DAQmx Stop Task before DAQmx Clear Task to make sure that you're not missing an error.
Try "Log and Read" mode. "Log and Read" is probably going to be fast enough for your application (as it's pretty fast), so you might just try it and see if you get any different result. All that you would need to do is change the enum to "Log and Read", then add a DAQmx Read in the loop (you can just use Raw format since you don't care about the output). I'd recommend that you read in even multiples of the sector size (normally 512) for optimal performance. For example, your rate is 1MHz, perhaps read in sizes of 122880 samples per channel (something like 1/8 of the buffer size rounded down to the nearest multiple of 4096). Note: This is a troubleshooting step to try and narrow down the problem.
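As a sanity check on the arithmetic above (the function name and the 1-second buffer size are assumptions of this sketch, not values read from the DAQmx configuration), the suggested read size can be derived like this:

```python
# Derive a sector-aligned DAQmx read size: roughly 1/8 of the buffer,
# rounded down to the nearest multiple of 4096 (8 x 512-byte sectors).
SECTOR_ALIGN = 4096

def aligned_read_size(buffer_size: int, fraction: int = 8,
                      align: int = SECTOR_ALIGN) -> int:
    """Return buffer_size // fraction rounded down to a multiple of align."""
    chunk = buffer_size // fraction
    return (chunk // align) * align

# Assuming a 1 MHz rate with a buffer of about 1 s of samples per channel:
print(aligned_read_size(1_000_000))  # -> 122880
```

With a hypothetical buffer of 1,000,000 samples this reproduces the 122880 samples-per-channel figure mentioned above.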
Finally, how confident are you in the results from the previous HSDL test? Which readings make more sense? I look forward to hearing more detail about how the data is inconsistent (all data, how different, any patterns). As well, I'll be looking forward to hearing the result of test #2 above.
Thanks,
Andy McRorie
NI R&D -
BPC - Consolidation - Data Loading - Will BS/PL accounts data ONLY be loaded from ECC?
Dear All,
In BPC, when we load data from ECC for Consolidation, my understanding is that we load only BS and PL accounts' data for/by the entity.
Apart from BS and PL data, will there be any data that will have to be loaded into BPC for consolidation?
The following three financial statements -
-Statement of Cash Flow
-Statement of Changes in Equity
-Statement of Comprehensive Income
are actually derived/calculated from the loaded BS and PL data. This is my understanding. Pls. correct me if I am wrong.
Thank you!
Regards,
Peri
Hi Peri,
The balance sheet, P&L, and those three financial statements are derived from BS/P&L accounts; however, there should also be "flow" information. Otherwise you won't end up with a correct consolidated cash flow or equity movement. (Or you can choose to enter the flow detail manually.)
Second, while getting BS & P&L accounts, you will also need the trading partner detail; otherwise you won't be able to do the eliminations. (Or you can choose to enter the trading partner detail manually for intercompany accounts.)
Third, you should also consider other disclosures, depending on what standard you are implementing (IFRS, US GAAP, local GAAP, etc.).
Hope this gives an idea.
Mehmet. -
Custom data for equipment load from ECC to CRM 7.0
Dear All,
Please suggest the steps required to trace the initial/delta load data for equipment from ECC to CRM & carry out the enhancement for missing fields data.
Thanks & Regards
DB
Hi,
You have to use the download object CRM_EQUI_LOAD. First ensure that all the materials and business partners are downloaded from ECC to CRM.
For more details, just go through
http://help.sap.com/saphelp_sm40/helpdata/en/94/657a3b233b8541a18ed80b424bf1f8/frameset.htm
Best Regards,
Rajendra -
Issue in Temp table creation in PCM Databridge and data migration from ecc to PCM using databridge
Hi,
We are working on mapping SAP PCM table structure data to the data coming from ECC. We are using the Databridge tool to upload the data; the upload from Databridge involves using a specification file to load the data into SAP PCM.
The problem we are facing is that the responsibility centers (cost centers) in ECC and SAP PCM are different. There is a one-to-many relationship between them, i.e. one ECC cost center maps to many SAP PCM cost centers, and we need to split the line item value of one ECC cost center across many SAP PCM cost centers. For example, cost center C1 from ECC has a value of 4000 to be split across cost centers PCM1, PCM2, PCM3 and PCM4 in SAP PCM.
We need to automate this using Databridge, and we are stuck on this task.
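For what it's worth, the splitting step itself is straightforward once the mapping and the split weights are known; the hard part is feeding the result through Databridge. A minimal sketch (the mapping table and the weights are hypothetical, not from the poster's system):

```python
# Split one ECC cost center line-item value across several SAP PCM cost
# centers according to fixed weights. Mapping and weights are hypothetical.
ecc_to_pcm = {
    "C1": [("PCM1", 0.40), ("PCM2", 0.30), ("PCM3", 0.20), ("PCM4", 0.10)],
}

def split_value(ecc_cc: str, value: float) -> dict:
    """Distribute an ECC cost center value across its mapped PCM cost centers."""
    return {pcm: round(value * weight, 2) for pcm, weight in ecc_to_pcm[ecc_cc]}

print(split_value("C1", 4000.0))  # -> {'PCM1': 1600.0, 'PCM2': 1200.0, 'PCM3': 800.0, 'PCM4': 400.0}
```

In practice the split rows would then be written to the flat file that the Databridge specification file consumes.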
Please guide us.
Also, we are trying to create a temporary table using the temporary table command in the SPE file of Databridge. The purpose of this is to put the ECC data into a temp table as per our requirements and then use the temp table as the data source for putting the data into PCM.
Please advise us on this approach
Regards
Shrirang
9552334897 -
Load transaction data from ECC to BPC 10.1 using US GAAP starter kit SP3
I need to understand a bit more about the process to load transactional data into BPC 10.1. We have the US GAAP Starter Kit SP3. Below is a screenshot from the config guide:
It explains how transactional data can be extracted from the ECC system through the SAP NetWeaver BW DataSource 0FI_GL_10 and then transformed to get it loaded into BPC. The objects /PKG/FC_C01 and /PKG/FC_DS01 are just mentioned here as a reference, because they come with the RDS for Financial Close and Disclosure Management, which not all companies have.
I believe the upwards data flow should be from DataSource 0FI_GL_10 to DataStore Object 0FIGL_O10 and then to InfoCube 0FIGL_R10. There is also a data flow that goes from 0FI_GL_10 directly to InfoCube 0FIGL_R10. Can anyone with experience with the US GAAP starter kit confirm this?
Thank you.
Hello, we were able to load actuals into our environment with the US GAAP Starter Kit, SP03. I followed the Operating Guide document up to section 5.2 and ran the Consolidation Data Manager Package with no issue. We are using the A20 Input and A20 Consolidation process flows based on flows F00, F10, F20, F30, F99, etc. According to the documentation, the Statement of Cash Flow and the Changes in Equity should be automatically calculated and available from the changes in balance sheet accounts once consolidation is completed. However, when I ran the corresponding reports, they bring back no data.
We loaded actual data for the whole of 2013 and January 2014. Our intention is to run the first consolidation for January 2014. The closing balance for period 12.2013 in flow F99 was copied to flow F00 of 2014 (opening balance). Flows F20 and F30 were updated with the corresponding balance sheet movements (increase/decrease), according to the delivered controls in the starter kit. However, the cash flow is still showing no results.
I found the following text in the operating guide, but I am not clear whether I am missing a step. Can you please clarify? This seems to be telling me that I need to copy my 01.2014 opening balance (F00) to the 12.2013 closing balance (F99; this is done by the Copy Opening data manager package, but in the opposite direction, from Y-1 F00 to Y F99) and, in addition, to also copy that balance to 12.2013 F00 (the previous year's opening balance)?
"5.2.2 First Consolidation
When operating the consolidation for a given Scope for the first time in the application, it is necessary to populate and process the prior year-end time period for this Scope, in addition to the first required consolidation period. This is because the flows dedicated to the scope change analysis are properly populated by the consolidation engine provided that at least one automatic journal entry has been detected in the consolidation used as opening consolidation, which by default is the prior year end (December, Y-1). To accomplish this, it is recommended that you copy all input-level opening data of the first required consolidation (flow “F00”) into the closing data (“F99”) and opening data (“F00”) of the prior year-end time period, including intercompany breakdowns.
This breakdown by intercompany in the balance sheet triggers automatic eliminations, which are carried over to the closing position. After the consolidation process, the closing data of the prior year
end time period will contain the appropriate source data for the opening data of the following consolidation, notably the opening automatic journal entries." -
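The "first consolidation" preparation described in the quoted passage boils down to a data copy, which can be sketched as follows (the record layout, period names, and amounts are simplified assumptions for illustration, not the actual BPC model):

```python
# Copy input-level opening data (flow F00) of the first consolidation
# period into both closing (F99) and opening (F00) data of the prior
# year-end, as the starter kit's "First Consolidation" section requires.
def seed_prior_year_end(records, first_period="2014.01", prior_year_end="2013.12"):
    """Return the records plus prior-year-end F99/F00 copies of first-period F00."""
    seeded = list(records)
    for rec in records:
        if rec["period"] == first_period and rec["flow"] == "F00":
            for flow in ("F99", "F00"):
                seeded.append(dict(rec, period=prior_year_end, flow=flow))
    return seeded

data = [{"period": "2014.01", "flow": "F00", "account": "1000", "amount": 500.0}]
for rec in seed_prior_year_end(data):
    print(rec)
```

The real copy would also carry the intercompany breakdowns, as the guide notes, so that the eliminations are triggered in the prior year-end consolidation.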
Customer classification data from ECC to CRM
Dear Friends,
We have the middleware active between ECC and CRM for customers. Now we have the requirement to download customer classification data (XD03 -> Extras -> Classification) from ECC to CRM. Please suggest how we can map this data in CRM and what middleware developments are required.
Thanks,
Rajinikanth G
We added new fields using the AET and did the middleware enhancements as per Note 736595 - Exchange of EEW fields with R/3 customer master.
Thanks,
Rajinikanth G -
Hello Experts!
We are trying to set up data collection for the following scenario:
Typically our order quantity can vary from 1000 - 5000 pcs and we want to create only one SFC per shop order. We want to collect data after a fixed number of pcs are reported, for example, after every 100 pcs reported. So in above example number of iterations for data collection could vary from 10 - 50.
We see that there are two fields, "Required Data Entries" and "Optional Data Entries", in the Data Collection parameter detail tab, but it looks like those take static values, whereas we want this number to change based on the order quantity.
We also noticed another issue with these fields for our scenario: if we use the "Required Data Entries" field, the user has to collect all the iterations together, but that is not possible, since we are collecting after reporting a certain quantity. If we use "Optional Data Entries", the system allows the user to collect multiple times, but the pop-up does not indicate how many iterations have already been collected, which is confusing for the users.
Has anyone else had any experience with a similar scenario?
Thanks in advance!
-Venkat
Hello Venkat,
To collect data against the same SFC several times, you should enable the corresponding system rule "Allow Multiple Data Collection". The "Optional Data Entries" field rather controls the number of optional entries that you can enter (i.e. you measure the temperature of an SFC and need to enter several measurements for a single data collection).
As long as you enable the system rule, it will be up to you when to collect it. But there is no standard functionality to force it after certain Qty of SFC processed. You'll need a customization for it.
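The quantity-driven iteration count the poster wants (a customization, as noted above) is simple arithmetic; a sketch, with the 100-pc reporting step taken from the scenario and everything else assumed:

```python
import math

# Number of data collection iterations when collecting once after every
# fixed reported quantity (100 pcs per the scenario above).
def required_iterations(order_qty: int, report_step: int = 100) -> int:
    """Ceiling division: one collection per reported batch, partial batch included."""
    return math.ceil(order_qty / report_step)

print(required_iterations(1000))  # -> 10
print(required_iterations(5000))  # -> 50
```

For order quantities of 1000-5000 pcs this yields the 10-50 iterations mentioned in the question.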
Br, Alex. -
SEM-BCS data extractor from ECC general ledger table(s)
We are a utility company working on an SEM-BCS implementation, and we use the FERC solution. We do not use the New GL. We are trying to extract the transaction data from ECC to a BI virtual remote cube. We cannot use the profit center extractor (0EC_PCA_3), as the profit center tables do not contain any FERC data. We need to be able to extract the transaction data from a general ledger table. We have run into several issues with the various extractors we have tried, because they don't allow direct access (0FI_GL_4) or are at a summary level, so we can't extract group account, trading partner, and transaction type detail (0FI_GL_1). Would you have any suggestions on how to extract general ledger data with the required detail from ECC, so it can be loaded into a BI virtual remote cube?
We are going forward with getting the natural account detail data using the profit center extractor 0EC_PCA_3, and getting the FERC summary data using the general ledger extractor 0FI_GL_1. In our testing so far, this combination provides the data we need in BCS.
-
No master data upload from ECC to GTS
Hello everyone,
Today I opened an OSS message regarding a problem on our client's side.
We are implementing SAP GTS 7.2 SP08, but I am not able to upload any master data (customers and/or materials) from ECC to GTS. This is my third implementation project, and on both previous projects I did not have any trouble with this topic.
I searched the whole forum and tried every hint, everything without any success. ;-(
Unfortunately, the OSS note is in German, but I hope that someone can help me find a solution. Thanks to all very much.
Here is the OSS Note:
No initial transfer of customer and material masters, OSS Note 0000637481
All required support package levels for R/3, the plug-in, and GTS are installed according to the Master Guide.
R/3 client:
- RFC destinations set up correctly; running cleanly
- Method calls set up correctly
- ALE distribution model set up correctly
- Change pointers activated globally
- Change pointers set up correctly (see below)
Customer master /SAPSLL/DEBMAS_SLL
Vendor master /SAPSLL/CREMAS_SLL
Material master /SAPSLL/MATMAS_SLL
- Change pointers assigned to the respective function modules (see below)
/SAPSLL/DEBMAS_DISTRIBUTE_R3
/SAPSLL/CREMAS_DISTRIBUTE_R3
/SAPSLL/MATMAS_DISTRIBUTE_R3
- Number ranges for change pointers created
GTS client:
- Logical systems created and assigned to groups
- RFC destinations set up correctly; running cleanly
- Method calls set up correctly
- Partner mapping set up
- Organizational structures created, and company code and plants from the feeder system assigned
- Number ranges created
Example:
1. A customer master in the feeder system has no postal code in its master data. On transfer to GTS, a transfer log is created in the GTS system with the note that the customer could not be transferred due to the missing postal code. Correct system behavior.
2. The customer master was then maintained accordingly in the feeder system. On renewed transfer, "0 of 1 partners transferred" is displayed, and a transfer log is no longer created in GTS.
Also, no application log is written in the feeder system under transaction SLG1. Likewise, transaction SM58 shows no error messages.
Edited by: Andreas Drees on Jun 29, 2009 3:35 PM
Hi Sameer,
thanks very much for your instructions. We created the variants and debugged it, and now we have found out where the error occurs.
The call of the function module "/SAPSLL/API_1006__SYNCH_MASS" for calling GTS gives us the following message:
"Not just yet all adress numbers are collected."
On the other hand, SAP Basis support gave us a possible solution. They told us:
The problem occurs because the tables TBD24 and TBD62 are not filled correctly.
Please use the following steps to correct this:
- Transaction BD53
- Select the message type (/SAPSLL/CREMAS_SLL, etc.) in change mode
- Choose a segment (mark an unmarked segment)
- save
- Delete the activated segment
- Activate the change pointer
- In transaction BD60, set up the function module correctly (/SAPSLL/DEBMAS_DISTRIBUTE_R3)
The tables should be filled correctly after these steps. You can check this with report /SAPSLL/PLUGIN_CHECK_R3.
So we went through these steps, but without any success. I'm going crazy. ;-(
Regards,
Andreas -
Issue in transfer of data from ECC to APO
Hi All,
I have a requirement to transfer data from ECC to APO. I am using EXIT_SAPLCMAT_001 for this purpose. The problem is, I need to transfer the data of a field that is not present in cif_matloc but is present in /sapapo/matloc.
How should I proceed? Please help; this is an urgent requirement.
Thanks & Regards,
SriLalitha
Hi,
you may want to go to the transaction /SAPAPO/SNP_SFT_PROF
Determine Forecast of Replenishment Lead Time
Use
In this field, you specify how extended safety stock planning determines the forecast of the replenishment lead time (RLT). The following values are available:
Supply Chain: The system determines the RLT forecast using the supply chain structure by adding the corresponding production, transportation, goods receipt, and goods issue times. If there are alternative procurement options, the system always takes the longest option into account.
Master Data: The system determines the RLT forecast from the location product master data.
Master Data/Supply Chain: First, the system determines the RLT forecast from the location product master data. If no RLT forecast can be determined, the system determines the forecast using the supply chain structure (as described under Supply Chain).
Dependencies
You can retrieve the replenishment lead time forecast yourself by using the GET_LEADTIME method of the Business Add-In (BAdI) /SAPAPO/SNP_ADV_SFT.
Replenishment Lead Time in Calendar Days
Number of calendar days needed to obtain the product, including its components, through in-house production or external procurement.
Use
The replenishment lead time (RLT) is used in the enhanced methods of safety stock planning in Supply Network Planning (SNP). The goal of safety stock planning is to comply with the specified service level, in order to be prepared for unforeseen demand that may arise during the replenishment lead time. The longer the RLT, the higher the planned safety stock level.
Dependencies
The field is taken into account by the system only if you have specified Master Data or Master Data/Supply Chain in the "RLT: Determine Forecast" field of the safety stock planning profile used.
Hope this helps.
The RLT from ECC is in MARC-WZEIT, which is transferred to APO in structure /SAPAPO/MATIO, field CHKHOR.
If you maintain the setting in the profile, you may get the value in RELDT.
Thanks. -
Best practice data source from ECC 6.0 for legal consolidation in BPC NW7.5
Hi there,
After scanning every message in this forum for "data source", I wonder whether there isn't any standard approach from SAP to extract consolidation data from ECC 6.0. I have to say that my customer is not using the New G/L so far, and therefore the great guide "How to get balances from ECC 6.0 ..." does not fully work for us.
Coming from the old world of EC-CS, the first option is to go via the GLT3 table. This option requires clever customization and the need to keep GLT0 and GLT3 in line. Who has experience with the maintenance of these tables in a production environment?
We therefore plan to use DataSource 0FI_GL_4, which contains all line items of the posted financial documents. Does this approach make sense, or will it fail because of performance issues?
Any help is appreciated!
Kind regards,
Dierk
Hi Dierk,
Do you have a need for the level of detail provided by the 0FI_GL_4 extractor? Normally I would recommend going for the basic 0FI_GL_6 extractor, which provides a much more manageable data volume since it only gives the periodic activity and balances as well as a much smaller selection of characteristics. Link: [http://help.sap.com/saphelp_nw70/helpdata/en/0a/558cabb2e19a4db3097b81bba4fd0e/frameset.htm]
Despite this initial recommendation, every client I've had has eventually had a need for the level of detail provided by the 0FI_GL_4 extractor (or the New G/L equivalent - 0FI_GL_14). Most BW systems can handle the output of the line-item extractors without much issue, but you should test using production data and make sure your system sizing takes into account the load.
The major problem you can run into with the line-item extractors is that if your delta somehow gets compromised it can take a very long time (days, sometimes weeks) to reinitialize and this can cause a large load in your ECC and BW system. Also, for the first transport to production, it is important to plan time to initialize the delta.
Ethan -
Unable to extract the data from ECC 6.0 to PSA
Hello,
I'm trying to extract data from the ECC 6.0 DataSource 2LIS_11_VAHDR into BI 7.0.
When I try to run a full load into the PSA, I get the following error message:
Error message: "DataSource 2LIS_11_VAHDR must be activated"
Actually, the DataSource is already active; I looked at it using transaction LBWE and it is active.
In BI, when I right-click the DataSource (2LIS_11_VAHDR) and select "Manage", the system throws the error message below:
"Invalid DataStore object name /BIC/B0000043: Reason: No valid entry in table RSTS"
If anybody has faced this error message, please advise what I'm doing wrong.
Thanks in advance.
On the ECC 6.0 side:
Delete the setup tables
Fill the setup tables
Schedule the job
I can see the data using RSA3 (2LIS_11_VAHDR): 1000 records
On the BI 7.0 side (Support Package 15):
Replicated the DataSource in Production in the background
Migrated the DataSource from 3.5 to 7.0 in Development
I didn't migrate from 3.5 to 7.0 in Production; it is not allowing it
When I try to schedule the InfoPackage, it gives the error message "Data Source is not active"
I'm sure this problem is related to the DataSource 3.5-to-7.0 conversion in Production. In Development there is no problem, because I converted the DataSource from 3.5 to 7.0 manually.
Thanks