Data loading sequence for 0FI_GL_14
Hi experts,
Can you briefly explain the data loading sequence for 0FI_GL_14?
Regards.
Prasad
Hi,
Following is my system configuration information
Software Component | Release | Level | Highest Support Package | Short Description
SAP_BASIS | 620 | 0058 | SAPKB62058 | SAP Basis Component
SAP_ABA | 620 | 0058 | SAPKA62058 | Cross-Application Component
SAP_APPL | 470 | 0025 | SAPKH47025 | Logistics and Accounting
SAP_HR | 470 | 0030 | SAPKE47030 | Human Resources
With the above configuration in R/3, I am not able to find the datasource 0FI_GL_14.
Can you please let me know which support package to install, and how to install it?
Regards.
Prasad
Similar Messages
-
Need data loader script for Buyer Creation for R12
Hi ,
Does anybody have a Data Loader script for buyer creation in R12? Please copy and paste one line of it in a reply to this message.
Thanks -
Check data load performance for DSO
Hi,
Can anyone provide the details on how to check the data load performance for a particular DSO?
For example, how much time did it take to load a given number of records (e.g. 200,000) into the DSO from the R/3 system? The DSO data flow is on BW 3.x.
Thanks,
Manjunatha.
Hi Manju,
You can use BW statistics and its standard technical content.
Regards,
Rambabu -
Data Load process for 0FI_AR_4 failed
Hi!
I am about to implement the SAP Best Practices scenario "Accounts Receivable Analysis".
When I schedule the data load process (in dialog, immediately) for transaction data 0FI_AR_4 and check it in the monitor, the status is yellow:
On the top I can see the following information:
12:33:35 (194 from 0 records)
Request still running
Diagnosis
No errors found. The current process has probably not finished yet.
System Response
The ALE inbox of BI is identical to the ALE outbox of the source system
or
the maximum wait time for this request has not yet been exceeded
or
the background job has not yet finished in the source system.
Current status
No Idocs arrived from the source system.
Question:
Which actions can I take to make the loading process run successfully?
Hi,
It seems the job is still in progress.
You can monitor the job that was created in R/3 (by copying the technical name from the monitor, adding "BI" to it as a prefix, and searching for this in SM37 in R/3).
Keep an eye on ST22 as well if this job is taking too long, as you may already have gotten a short dump for it that has not yet been reported to the monitor.
Regards,
De Villiers -
How to identify the maximum data loads done for a particular day?
Hi all,
A huge volume of data was loaded last Monday, and the data was then deleted from the cubes on the same day. I need to see which cubes were loaded with a large number of records on that particular day.
I looked at RSMO, but I am unable to see the ODS or cube data loads there.
Where do I see this?
Thanks
See if the table RSSELDONE helps. It will give you the recent data load details. Based on those loads, you can search the targets.
Also check table TBTCO, which gives the latest job details. You will have to analyze those jobs to see what loads were done. Give a selection for the date. -
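The idea behind the two table lookups can be sketched in a few lines. The rows below are made up for illustration (RSSELDONE and TBTCO are real SAP tables, but this sketch only mimics filtering their entries by date and record count):

```python
from datetime import date

# Made-up rows mimicking RSSELDONE-style load entries:
# (data target, load date, number of records)
loads = [
    ("ZCUBE_SALES", date(2011, 5, 2), 1_200_000),
    ("ZODS_FIN",    date(2011, 5, 2),   350_000),
    ("ZCUBE_HR",    date(2011, 5, 3),     4_000),
]

def loads_on(day, min_records=0):
    """Targets loaded on a given day, biggest loads first."""
    hits = [l for l in loads if l[1] == day and l[2] >= min_records]
    return sorted(hits, key=lambda l: -l[2])

# Which targets received big loads on that Monday?
monday = loads_on(date(2011, 5, 2), min_records=100_000)
assert [t for t, _, _ in monday] == ["ZCUBE_SALES", "ZODS_FIN"]
```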
Master Data/transactional Data Loading Sequence
I am having trouble understanding the need to load master data prior to transactional data. If you load transactional data when there is no supporting master data, and you subsequently load the master data, are the SIDs established at that time, or will they not sync up?
I feel in order to do a complete reload of new master data, I need to delete the data from the cubes, reload master data, then reload transactional data. However, I can't explain why I think this.
Thanks, Keith
A different approach is required for different data target scenarios. Below are just two scenarios out of many possibilities.
Scenario A:
The data target is a DataStore Object with the indicator 'SID Generation upon Activation' set in the DSO maintenance, and a DTP is used for data loading.
The following applies depending on the indicator 'No Update without Master Data' in DTP:
- If the indicator is set, the system terminates activation if master data is missing and produces an error message.
- If the indicator is not set, the system generates any missing SID values during activation.
Scenario B:
The data target has a characteristic that is determined in the transformation rules/update rules by reading master data attributes.
If the attribute is not available during the data load to the data target, the system writes an initial value to the characteristic.
When you later reload the master data with attributes, you need to delete the previous transaction data load and reload it, so that the transformation can re-determine the attribute values written to the characteristics in the data target.
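Scenario B can be illustrated with a small stand-alone sketch. The table and attribute names here are hypothetical, not from any SAP API; the point is only that records written while the attribute was missing keep the initial value until the transaction data itself is reloaded:

```python
# Hypothetical illustration of Scenario B: a transformation that derives an
# attribute from master data while loading transaction data into a target.

# Master data for characteristic MATERIAL -> attribute MATL_GROUP.
# At first load time, M2's master data has not arrived yet.
master_data = {"M1": "GROUP_A"}

def transform(records):
    """Derive MATL_GROUP per record; missing master data yields the
    initial value '' (mirroring the behaviour described above)."""
    return [
        {**r, "MATL_GROUP": master_data.get(r["MATERIAL"], "")}
        for r in records
    ]

transactions = [{"MATERIAL": "M1", "AMOUNT": 100},
                {"MATERIAL": "M2", "AMOUNT": 200}]

first_load = transform(transactions)
# M2 received the initial value because its master data was missing.
assert first_load[1]["MATL_GROUP"] == ""

# Master data arrives later, but records already written keep ''.
master_data["M2"] = "GROUP_B"
# Only deleting and reloading the transaction data re-derives the attribute:
reloaded = transform(transactions)
assert reloaded[1]["MATL_GROUP"] == "GROUP_B"
```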
Hope this helps you understand. -
Data load process for FI module
Dear all,
We are using BI 7.00, and in one of our FI datasources, 0EC_PCA_1, we had a data load failure. The cause of the failure was analysed and we did the following:
1) deleted the data from cube and the PSA
2) reloaded (full load) data - without disturbing the init.
This solved our problem. Now when the data reconciliation is done we find that there are doubled entries for some of the G/L codes.
I have a doubt here.
Since there is no setup table for FI transactions (correct me if I am wrong), the full load picked up data that was also present in the delta queue, and the subsequent delta load then loaded the same data again (for some G/L accounts that were available as delta).
Kindly explain how FI data loads function. Should we go for downtime, and how do FI data loads work without setup tables?
Can the experts provide a solution for addressing this problem? Can anyone provide a step-by-step process to solve it permanently?
Regards,
M.M
Hi Magesh,
The FI datasources do not involve Setup tables while performing full loads and they do not involve outbound queue during delta loads.
Full load happens directly from your datasource view to BI and delta is captured in the delta queue.
Yes you are right in saying that when you did a full load some of the values were pulled that were also present in the delta queue. Hence you have double loads.
You need to reinitialise completely, as the init/delta process has been disturbed. Whether to take downtime depends on how frequently transactions are posted.
You need to:
1. Completely delete the data in BW, including the initialisation.
2. Take downtime if necessary.
3. Reinitialise the whole datasource from scratch.
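The double-count effect described above can be sketched in a few lines. The document numbers and amounts are made up, only to show why the reconciliation finds doubled G/L entries:

```python
# Hypothetical illustration: why a repair full load on top of an undisturbed
# init double-counts documents still sitting in the delta queue.

# Documents already posted in R/3 (the datasource view sees all of them).
posted_docs = [("DOC1", 100), ("DOC2", 250), ("DOC3", 50)]

# DOC3 was posted after the init, so it is also waiting in the delta queue.
delta_queue = [("DOC3", 50)]

# The repair full load reads the datasource view directly -> all documents.
full_load = list(posted_docs)

# The next delta load then delivers the queued document a second time.
loaded = full_load + delta_queue

totals = {}
for doc, amount in loaded:
    totals[doc] = totals.get(doc, 0) + amount

# DOC3 is doubled: 50 from the full load + 50 from the delta.
assert totals["DOC3"] == 100
```

This is exactly why the reply recommends deleting the data including the init and reinitialising, or running such repair loads only in a posting-free window.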
Regards,
Pramod -
Attribute data load fails for 0RPA_MARM Infoobject
Hi BW experts,
The attribute data load to the InfoObject 0RPA_MARM (not an InfoProvider) stays in yellow status for a long time and then throws the error below:
Runtime Errors :UNCAUGHT_EXCEPTION
Except : CX_RSR_X_MESSAGE
I checked the forums for this issue, but only found solutions where the target is an InfoProvider.
Please suggest.
Regards,
Leo.
Hi Leo,
Loading data into a master data InfoObject and making it an InfoProvider are two different things. Loading data into a master data InfoObject works the same way for all InfoObjects; when you make one an InfoProvider, you gain the ability to report directly on that InfoObject.
Regards,
Durgesh. -
Data load error for Init.
Hello Gurus,
I am having a problem loading the init.
During the data load I am getting the following error messages:
1. System error occurred (RFC call)
2. Activation of data records from ODS object ZDSALE01 terminated
3. No confirmation for request ODSR_45F46HE2FET0M7VFQUSO2EHPZ when activating the ODS object ZDSALE01
4. Request REQU_45FGXRHFHIKAEMXL3D7HU6OFR, data package 000001 incorrect with status 5
5. Request REQU_45FGXRHFHIKAEMXL3D7HU6OFR, data package 000001 not correct
6. Inserted records 1- ; Changed records 1- ; Deleted records 1-
Please help me resolve these errors.
Hi,
Are you loading a flat file? If yes, check whether the file is available at the specified path and whether you/SAP have the authority to access that path/file.
regards
Siggi -
Data loading mechanism for flat file loads for hierarchy
Hi all,
We have a custom hierarchy which gets its data from a flat file stored on the central server, and that file gets its data from MDM through XI. Now, if we delete a few records in MDM, the data picked up in BI will no longer contain the deleted records. Does this mean that the hierarchy load deletes the data it already contains and does a full load, or do we have to delete the records from the tables in BI and reload every time we load the data?
We also have some Web Service text datasources (loaded from XI).
Is the logic for updating the hierarchy records different compared to the existing Web Service interfaces?
Can anyone please tell me the mechanism behind these data loads, and differentiate it for the data loads mentioned above?
Create the ODS with the correct keys and run full loads from the flat files. You can have a cube pulling data from the ODS:
1. Load data into the ODS.
2. Create the cube.
3. Generate the export datasource (RSA1 > right-click the ODS > Generate Export DataSource).
4. Replicate the export datasource (RSA1 > Source System > DS Overview > search for the datasource starting with '8' plus the ODS name).
5. Press the '+' button and activate the transfer rules and communication structure.
6. Create the update rules for the cube with the above InfoSource (same name as the '8<ODS name>' datasource).
7. Create an InfoPackage with initial load (in the Update tab).
8. Load data to the cube.
9. Load new full loads to the ODS.
10. Create a new InfoPackage for delta (in the Update tab).
11. Run the InfoPackage. (Any changed/new records will be loaded to the cube.)
Regards,
BWer
Assign points if helpful. -
Data Loading Error for cube 0TCT_C22
Dear Colleagues,
I am working on BI 7.0 / SP09.
I am loading the technical content cube 0TCT_C22 from datasource 0TCT_DS22. Up to the PSA there is no problem with the data load, but from the PSA to the data target the load fails. The monitor shows the error "Error calling number range object 0TCTPRCSCHN for dimension D2 ( ). Message no: RSAU023".
I tried to find SAP Notes for this, but without success. I also checked the dumps and application logs, but there is nothing there.
Please advise.
Regards
PS
Hi Pank,
I just solved a very similar issue. Try what I did and see if it works for you. For each dimension in each InfoCube a number range is created. For some reason, during the activation of the InfoCube, the number range for the problematic dimension was not created. Look for it in transaction SNRO: you should find a number range for each dimension in the cube except the one giving you the error.
To solve it (the easiest way I found), just add any characteristic to the problematic dimension and activate the InfoCube. After that, modify the InfoCube again, remove the characteristic you just added to leave the dimension as you need it, and activate the InfoCube again. This forces the regeneration of the dimension and, with it, the number range. You can check in transaction SNRO that the number range is now there. Try loading your data again and it should work.
One thing I don't understand is why that number range is sometimes not created during activation.
Good luck, I hope you can solve it!!!
Regards,
Raimundo Alvarez -
Data load error for master data
Hi,
Do I need to fix both errors manually, or is there an automatic way?
In the PSA, do I need to edit the record and run the InfoPackage again?
I need to keep the ALPHA conversion for the InfoObject (/BIC/ZAKFZKZ) at the InfoObject level.
Errors in values of the BOM explosion number field:
Diagnosis
Data record 5584 & with the key '001 3000000048420000 &' is invalid in
value 'BOX 55576\77 &' of the attribute/characteristic 0RT_SERIAL &.
InfoObject /BIC/ZAKFZKZ contains non-alpha compliant value 1
Hi Suneel,
I think the symbol '&' is causing the issue in your data loads.
Try to edit the value in the PSA and then load the data from the PSA to the target.
You can also use transaction RSKC to allow such special characters.
For more information you can search SDN.
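The '&' problem can also be caught before the load with a simple pre-check. A minimal sketch (the permitted-character set below is an assumption for illustration only; in BW the actual list of extra permitted characters is maintained via RSKC):

```python
import string

# Hypothetical pre-check in the spirit of RSKC: flag attribute values that
# contain characters outside an assumed allowed set before they are loaded.
ALLOWED = set(string.ascii_uppercase + string.digits + " .-_/")

def is_permitted(value):
    """True if every character of the (upper-cased) value is allowed."""
    return set(value.upper()) <= ALLOWED

# The failing value from the error message contains '&' (and '\'):
assert not is_permitted("BOX 55576\\77 &")
# A cleaned-up variant passes:
assert is_permitted("BOX 55576/77")
```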
Thanks
Hope this helps -
FDQM Sample data load file for HFM
Hi all,
I have just started working on FDQM 11.1.1.3. I have integrated an HFM application with FDQM. I need to load data into the HFM application, but I don't have any GL file or CSV file with HFM dimensions, so can anyone help me obtain such a file?
And one more thing:
I just want to know the basic steps I need to perform to load data into the HFM application after creating the FDM application and integrating it with the HFM application.
Thanks.
Hi,
After creating the FDM application, integrating it with the HFM application also includes setting the target application name in FDM.
Now the FDM application is ready, with its target set to the HFM application.
1. Create an import format as below for the application, using only the Custom1 and Custom2 dimensions of the 4 available Custom dimensions (you can modify this according to your dimensions). Use '|' (pipe) as the delimiter:
Account|Account Description|Entity|C1|C2|Amount
2. Create a text file with data like the example below, combining members from each dimension specified in the import format:
11001|Capex|111|000|000|500000
11002|b-Capex|111|000|000|600000
Note: these dimension members must be mapped in the 'Mappings' section; use the members specified in the file as the source and select any target member of HFM for them.
3. Then tag this import format with any location
4. Import the text file using 'Browse' option
5. Validate the data loaded
Note: If mapping is done for all dimension members used in file, then validation will be successful
6. Export the data
Now you can export, and hence load, the data to the HFM application using the Replace/Merge option.
Here you are with the basic steps to load the data from FDM to HFM.
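The load file from steps 1-2 can be sanity-checked before importing. A small stand-alone sketch (the field names mirror the example import format above; nothing here is an FDM API, it only validates the pipe-delimited layout):

```python
# Hypothetical pre-check of the pipe-delimited FDM load file from the
# example above: Account|Account Description|Entity|C1|C2|Amount

FIELDS = ["Account", "Account Description", "Entity", "C1", "C2", "Amount"]

sample = """\
11001|Capex|111|000|000|500000
11002|b-Capex|111|000|000|600000
"""

def parse(text):
    """Split each line on '|', check the column count, type the amount."""
    records = []
    for line in text.strip().splitlines():
        values = line.split("|")
        if len(values) != len(FIELDS):
            raise ValueError(f"bad column count in line: {line!r}")
        rec = dict(zip(FIELDS, values))
        rec["Amount"] = float(rec["Amount"])  # amounts must be numeric
        records.append(rec)
    return records

records = parse(sample)
assert records[0]["Account"] == "11001"
assert records[1]["Amount"] == 600000.0
```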
Regards,
J -
Flat file data load error for Dates
hi
I am loading data to a cube daily via a flat file.
Data is like:
order no....date1.............date2.......amount
1122..........20090101..20090102.....100
1123..........20090101..20090102.....500
1124..........20090101.......................200
Now, for some records Date2 is empty, so when I load the flat file it gives a dump:
"Incorrect value for Date2 on record no. 3"
How do I correct this?
There is a good possibility that Date2 will be empty for certain records.
Thank you very much for your patience.
Are you sure you are on BW 3.5? I cannot find 'Display DataSource' anywhere!
Here are the options I see in my BW 3.5 system:
If I click on the InfoSource, I see the option 'Assign DataSource' but not 'Display DataSource'.
If I right-click on the source system, I see options such as 'Change Transfer Rules' and 'Delete Transfer Rules', but no 'Display DataSource'.
If I right-click on the InfoPackage, I see options such as Change, Rename, and Delete.
In the Source Systems tab, if I right-click on the DataSource, I see only the 'Change Transfer Rules' option and the object overview.
When I double-click on 'Change Transfer Rules', I see only the DataSource/Transfer Structure and Transfer Rules tabs.
In DataSource/Transfer Structure I see that the output length is 8 and it is a characteristic.
In the object overview I don't see any Fields/Format tab, nor a communication structure. -
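One pragmatic way around the empty Date2 dump above is to pre-process the flat file so empty dates become a harmless placeholder before BW sees them. A minimal sketch, assuming the file is comma-separated with columns order_no, date1, date2, amount (the placeholder '00000000' is the usual initial value for an ABAP date field):

```python
import csv
import io

PLACEHOLDER = "00000000"  # initial-value date for empty Date2 fields

def fix_empty_dates(text):
    """Rewrite the flat file, replacing an empty date2 column with a
    placeholder so the load no longer dumps on record 3."""
    out = io.StringIO()
    writer = csv.writer(out)
    for row in csv.reader(io.StringIO(text)):
        order_no, date1, date2, amount = row
        writer.writerow([order_no, date1, date2 or PLACEHOLDER, amount])
    return out.getvalue()

raw = "1122,20090101,20090102,100\n1124,20090101,,200\n"
fixed = fix_empty_dates(raw)
assert ",00000000," in fixed  # the empty Date2 was filled in
```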
LSMW Master Data Load issue for UOM.
Hello Friends,
I hope that someone out there can provide an answer to my question.
I am attempting to use LSMW to load alternate units of measure via MM02. The main data is already in the system.
My Flat file has 2 levels.
Header--> contains Material Number
Line --> contains data relevant to each UOM that I want to enter.
When I do the Direct Input Method, I get the following message:
"The material cannot be maintained since no maintainable data transferred"
Here is the format of the flat file.
SOURCEHEAD HEADER
IND C(001) INDICATOR
Identifying Field Content: H
MATNR C(018) Material
SOURCELINE UOM DATA
IND C(001) INDICATOR
Identifying Field Content: L
MAT C(018) MAT
UMREN C(005) Denominator for conversion to base units of measure
MEINH C(003) Alternate Unit of measure sku
UMREZ C(005) numerator for conversion to base units of measure
EAN11 C(018) International Article Number (EAN/UPC)
NUMTP C(002) Category of International Article Number (EAN)
LAENG C(013) Length
BREIT C(013) Width
HOEHE C(013) Height
MEABM C(003) Unit of Dimension for Length/Width/Height
VOLUM C(013) Volume
VOLEH C(003) Volume unit
BRGEW C(013) Gross weight
NTGEW C(013) Net weight
GEWEI C(003) Weight Unit
When I process the data manually, I have no issues.
I think I may just be missing some piece of information necessary to get this data processed, but I cannot see what it is.
Any Help that can be provided would be great.
Regards,
Christopher
Hello,
You need to map BMMH1 along with BMMH6.
Map TCODE & MATNR in BMMH1 with XEIK1 = 'X' (Basic Data). Subsequently map the fields of structure BMMH6.
I just made a test now and it works fine for me.
Hope this helps.
Best Regards, Murugesh AS
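Independent of the LSMW mapping fix, the two-level H/L file structure described above can be sanity-checked with a small stand-alone sketch. This is not an LSMW API, just an illustration of how line records attach to their header; for readability it uses '|'-separated fields and a subset of the UoM columns instead of the fixed-width layout:

```python
# Hypothetical check of the two-level flat file: 'H' records carry the
# material number, 'L' records each carry one alternate unit of measure.

sample_lines = [
    "H|MAT-001",
    "L|MAT-001|1|CAR|12",   # MAT, UMREN, MEINH, UMREZ (subset of fields)
    "L|MAT-001|1|PAL|144",
    "H|MAT-002",
    "L|MAT-002|1|BOX|6",
]

def group_by_header(lines):
    """Group L records under their H record, checking the material matches."""
    materials = {}
    current = None
    for line in lines:
        parts = line.split("|")
        if parts[0] == "H":
            current = parts[1]
            materials[current] = []
        elif parts[0] == "L":
            # an L record must belong to, and match, the open header
            assert current is not None and parts[1] == current
            materials[current].append(
                {"UMREN": parts[2], "MEINH": parts[3], "UMREZ": parts[4]}
            )
    return materials

mats = group_by_header(sample_lines)
assert len(mats["MAT-001"]) == 2
assert mats["MAT-002"][0]["MEINH"] == "BOX"
```

A file whose L records do not match the preceding header (one likely cause of "no maintainable data transferred") fails the assertion inside the loop.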