Finding fields in R/3 for data extraction vs. Business Content
Hi all,
I am doing my first BW project after my training and I am finding it difficult to locate some fields that should be part of my report.
The consultant says Sales Group, Incoterms, Open Orders, Account Assignment Group, Sold-to Party, Ship-to Party, Payment Terms, Document Currency and Base Currency, Complete Delivery Indicator, Order Type, and Count of Complete Deliveries Required should all be in the order, while Transmission Type and Number of Transmissions should be in NAST.
I have gone to R/3 - RSA6 to look at DataSources and found 2LIS_01_S260 (Sales Order) to have most of the fields, but not payment terms, complete delivery indicator, open orders, base currency, and count of complete deliveries required. Also, these fields are not in the sales document header data.
Is it a case of not understanding the correct names of the fields because they are described in a different manner in R/3, or have I not done the proper thing? Can I just use an InfoCube in BW Business Content to get all the required objects (InfoCube, InfoSource, DataSource)?
Thank you
Hi,
Finding the correct InfoObject and its mapped field is the job of a BW consultant. The problem is how to find them, and how to find the respective R/3 field and its table; you will become familiar with them as your experience increases.
So you need to list the fields you require and then find the DataSource(s) that can provide these fields, by going through the Business Content help or by using transaction LBWE in R/3.
The DataSource you mentioned, 2LIS_01_S260, is an obsolete one.
You are better off going with newer DataSources such as 2LIS_11_VAITM and 2LIS_11_V_ITM.
http://help.sap.com/saphelp_bw31/helpdata/en/8f/36f338472b420de10000000a114084/frameset.htm
If almost all of the fields are available in a DataSource, then you can check the possibility of enhancing the DataSource for the remaining fields.
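The "list the fields you require, then match them against candidate DataSources" exercise described above can be sketched mechanically. A minimal Python illustration (all field and DataSource contents here are made-up examples; the real lists come from RSA6/LBWE and the Business Content help):

```python
# Hypothetical required fields from the report spec, and the fields each
# candidate DataSource delivers (illustrative names only).
required = {"PAYMENT_TERMS", "COMPL_DLV_IND", "BASE_CURRENCY", "ORDER_TYPE"}

datasources = {
    "2LIS_11_VAHDR": {"ORDER_TYPE", "PAYMENT_TERMS", "SOLD_TO"},
    "2LIS_11_VAITM": {"ORDER_TYPE", "COMPL_DLV_IND", "BASE_CURRENCY"},
}

# For each DataSource, report which required fields are still missing --
# if only a few are missing, enhancing that DataSource is an option.
for ds, fields in datasources.items():
    missing = sorted(required - fields)
    print(f"{ds}: missing {missing}")
```

A DataSource with a short "missing" list is the natural enhancement candidate.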
With rgds,
Anil Kumar Sharma .P
Similar Messages
-
How to find out the activation log for a info* from Business content
Hi,
I work in a BW 3.5 environment where a lot of projects are being done simultaneously and a lot of Business Content is being activated. BW often activates a lot of other objects when a seemingly unrelated section of Business Content is activated without much caution.
When I look into some of the objects related to my area of the Business content, I find them already activated.
How do I find the activation log history (who, what time, etc.) of objects such as InfoObjects, cubes, ODS objects, InfoSources, etc.?
This will help me immensely to communicate with the various people (who inadvertently activated them) and to find issues quickly.
Thanks
Arun
Hi,
The tables in which application logs are stored are:
BAL_AMODAL Application Log: INDX table for amodal com
BAL_INDX Application Log: INDX tables
BALC Application Log: Log or message context
BALDAT Application Log: Log data
BALHANDLE Application Log: Lock object dummy table
BALHDR Application log: log header
BALHDRP Application log: log parameter
BALM Application log: log message
BALMP Application log: message parameter
BALOBJ Application log: objects
BALOBJT Application log: object texts
BALSUB Application log: sub-objects
BALSUBT Application log: Sub-object texts
Within transaction SLG1 you can query system-wide object activation using various selection criteria.
Of course, for objects that have already been transported to PRD, you can furthermore search transport requests or object lists via transaction SE01.
Kind regards Patrick Rieken -
Ensure field sequence is correct for data for multiple source structures
Hi,
I'm using LSMW with IDOC message type 'FIDCC2' Basic type 'FIDCCP02'.
I'm getting an error that packed fields are not permitted.
I'm getting "Ensure field sequence is correct for data for multiple source structures."
Source Structures
HEADER_STRUCT G/L Account Document Header
LINE_STRUCT G/L Account Document Line
Source Fields
HEADER_STRUCT G/L Account Document Header
BKTXT C(025) Document Header Text
BLART C(002) Document Type
BLDAT DYMD(008) Document Date
BUDAT DYMD(008) Posting Date
KURSF C(009) Exchange rate
WAERS C(005) Currency
WWERT DYMD(008) Translation Date
XBLNR C(016) Reference
LINE_STRUCT G/L Account Document Line
AUFNR C(012) Order
HKONT C(010) G/L Account
KOSTL C(010) Cost Center
MEINS C(003) Base Unit of Measure
MENGE C(013) Quantity
PRCTR C(010) Profit Center
SGTXT C(050) Text
SHKZG C(001) Debit/Credit Ind.
WRBTR AMT3(013) Amount
I have changed the PAC3 fields to character fields of the same length to avoid the error message that no packed fields are allowed.
Structure Relations
E1FIKPF FI Document Header (BKPF) <<<< HEADER_STRUCT G/L Account Document Header
Select Target Structure E1FIKPF .
E1FISEG FI Document Item (BSEG) <<<< LINE_STRUCT G/L Account Document Line
E1FISE2 FI Document Item, Second Part of E1FISEG (BSEG)
E1FINBU FI Subsidiary Ledger (FI-AP-AR) (BSEG)
E1FISEC CPD Customer/Vendor (BSEC)
E1FISET FI Tax Data (BSET)
E1FIXWT Extended Withholding Tax (WITH_ITEM)
Files
Legacy Data On the PC (Frontend)
File to read GL Account info c:\GL_Account.txt
Data for Multiple Source Structures (Sequential Files)
Separator Tabulator
Field Names at Start of File
Field Order Matches Source Structure Definition
With Record End Indicator (Text File)
Code Page ASCII
Legacy Data On the R/3 server (application server)
Imported Data File for Imported Data (Application Server)
Imported Data c:\SYNERGO_CREATE_LCNA_FI_GLDOC_CREATE.lsmw.read
Converted Data File for Converted Data (Application Server)
Converted Data c:\SYNERGO_LCNA_FI_GLDOC_CREATE.lsmw.conv
Wildcard Value Value for Wildcard '*' in File Name
Source Structures and Files
HEADER_STRUCT G/L Account Document Header
File to read GL Account info c:\GL_Account.txt
LINE_STRUCT G/L Account Document Line
File to read GL Account info c:\GL_Account.txt
File content:
Document Header Text Document Type Document Date Posting Date Exchange rate Currency Translation Date Reference
G/L Account document SA 20080401 20080409 1.05 CAD 20080409 Reference
Order G/L Account Cost Center Base Unit of Measure Quantity Profit Center Text Debit/Credit Ind. Amount
44000022 1040 Line item text 1 H 250
60105M01 13431 TO 10 Line item text 2 S 150
800000 60105M01 Line item text 3 S 100
60110P01 6617 H 40 Line item text 4 S 600
44000022 ACIBRAM Line item text 5 H 600
The file structure is as follow
Header titles
Header info
Line titles
Line1 info
Line2 info
Line3 info
Line4 info
Line5 info
Could someone point me in the right direction?
Thank you in advance!
Curtis
Hi,
Thank you so much for your reply.
For example, I have VBAK (header structure)
and VBAP (item structure).
I think my file should be like this:
Identification content Fieldnames
H VBELN ERDAT ERNAM
Fieldvalues for header
H 1000 20080703 swapna
Identification content Fieldnames
I VBELP AUART
Fieldvalues for item
I 001 OR
002 OR
Is this format correct?
Please let me know whether I am correct or not. -
Steps for Data extraction from SAP r/3
Dear all,
I am new to SAP BW.
I have done data extraction from Excel into SAP BW system.
that is like
Create info objects > info area> Catalog
--> Character catalog
--> Key catalog
Create info source
Upload data.
create info cube
I need similar steps for data extraction from SAP R/3:
1. when the data is in Z-tables (using views/InfoSets/function modules, etc.)
2. when the data is in standard SAP, using Business Content.
Thanks and Regards,
Gaurav Sood
Hi,
check the links:
Generic Extraction
https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/84bf4d68-0601-0010-13b5-b062adbb3e33
CO-PA
http://help.sap.com/saphelp_46c/helpdata/en/7a/4c37ef4a0111d1894c0000e829fbbd/content.htm
CO-PC
https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/fb07ab90-0201-0010-c489-d527d39cc0c6
Inventory
https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/f83be790-0201-0010-4fb0-98bd7c01e328
Extractions in BI
https://www.sdn.sap.com/irj/sdn/wiki
LO Extraction:
/people/sap.user72/blog/2004/12/16/logistic-cockpit-delta-mechanism--episode-one-v3-update-the-145serializer146
/people/sap.user72/blog/2005/01/19/logistic-cockpit-delta-mechanism--episode-three-the-new-update-methods
/people/sap.user72/blog/2005/02/14/logistic-cockpit--when-you-need-more--first-option-enhance-it
/people/sap.user72/blog/2004/12/23/logistic-cockpit-delta-mechanism--episode-two-v3-update-when-some-problems-can-occur
/people/sap.user72/blog/2005/04/19/logistic-cockpit-a-new-deal-overshadowed-by-the-old-fashioned-lis
Remya -
Is it possible to delete data selectively from Business content cubes
Dear Experts,
Requesting your help to know whether it is possible to delete data selectively from Business Content cubes.
When I try to delete selectively from Business Content cubes, the background job gets cancelled, with ST22 logs stating:
A RAISE statement in the program "SAPLRSDRD" raised the exception condition "X_MESSAGE".
Since the exception was not intercepted by a superior program, processing was terminated.
I also tried with a few more Technical Content cubes, but the same thing happens.
Please let me know how to selectively delete data from Business Content cubes, if it is possible.
Thanks in advance for your favorable assistance.
Regards,
Ramesh-Kumar.
Hi Ramesh,
Follow below steps for selective deletion:
1. Transaction code: use the transaction code DELETE_FACTS.
2. Generate the selective deletion program:
A report program with the given name will be generated.
3. Selection screen:
Take the deletion program "ZDEL_EPBG" to transaction code SE38 to see/execute the program.
After executing, it will take you to a selection screen.
As we need to carry out the deletion selectively on calendar week, we need to get the screen field for the Calendar Week field. For this, click on the Calendar Week field and press F1.
Click on the Technical Information button to find the screen field name.
ABAP program to carry out the Calendar week calculation
Problem scenario: As stated earlier, the requirement is to delete data from the cube based on the calendar week. Thus, code must be developed such that the number of weeks is taken as input and the corresponding calendar week is determined. This calendar week is then passed to the deletion program in order to carry out the data deletion from the InfoCube.
Transaction code: Use T-code SE38 in order to create a program.
Logic: Suppose we need to delete the data older than 100 weeks.
a. Get the number of weeks and the system date into variables and calculate the total number of days:
lv_week = 100.              "number of weeks
lv_dte = sy-datum.          "system date
v_totaldays = lv_week * 7.  "total days
b. Get the corresponding calendar day from the total days. This is obtained by simply subtracting the total number of days from the system date.
lv_calday = lv_dte - v_totaldays.  "corresponding calendar day
c. Now, in order to get the calendar week corresponding to the calculated calendar day, we call the function module 'DATE_TO_PERIOD_CONVERT'. This function module takes a calendar day and a fiscal year variant as input and returns the appropriate fiscal period.
* Get the sales week time elements
call function 'DATE_TO_PERIOD_CONVERT'
  exporting
    i_date  = lv_calday
    i_periv = lc_sales
  importing
    e_buper = lv_period
    e_gjahr = lv_year
  exceptions
    input_false    = 1
    t009_notfound  = 2
    t009b_notfound = 3.
if sy-subrc = 0.
  ls_time-calweek(4)   = lv_year.
  ls_time-calweek+4(2) = lv_period.
endif.
v_week = ls_time-calweek.
Note: We can pass the fiscal year variant, which can be obtained from table T009B. For example, here the fiscal year variant is lc_sales = 'Z2'. LS_TIME can be any structure with suitable time units.
d. We have now obtained the required calendar week in the v_week variable. This calendar week is the week up to which we keep the data; data older than this week will be deleted by the deletion program.
Submitting the data deletion program for ZEPBGC01 with the key field:
SUBMIT ZDEL_EPBG WITH C039 LT v_week.
Here the calendar week value is passed to the deletion program ZDEL_EPBG via the screen field of the calendar week.
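As a quick cross-check of the week arithmetic above outside ABAP, the same "calendar week N weeks ago" calculation can be sketched in Python, assuming ISO calendar weeks rather than the Z2 fiscal year variant used in the ABAP code:

```python
from datetime import date, timedelta

def calweek_weeks_ago(today: date, weeks: int) -> str:
    """Return the calendar week (YYYYWW) lying `weeks` * 7 days before `today`."""
    target = today - timedelta(days=weeks * 7)
    iso_year, iso_week, _ = target.isocalendar()
    return f"{iso_year}{iso_week:02d}"

# Everything older than this week would be handed to the deletion program.
cutoff = calweek_weeks_ago(date.today(), 100)
```

The ABAP version differs only in that DATE_TO_PERIOD_CONVERT derives the week from the fiscal year variant instead of the ISO calendar.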
Hope this helps you.
Thanks,
Jitendra -
Creating a DataSource for Data Extraction
I need to extract the planning data from my planning book to an InfoCube. So I am following the procedure described at the following link.
http://help.sap.com/saphelp_scm50/helpdata/en/e0/9088392b385f6be10000000a11402f/content.htm
I am doing fine with 1A, 1B, and 1C, but once I click "Execute" (as described in 1D) nothing happens. I am supposed to get a screen with all the fields, but I simply go back to the Planning Area screen.
Also, I tried the path Planning Area - Extras - Data Extraction Tools. On the following screen, I gave the DataSource technical name and clicked the "Generate DataSource" button. Again, I get the same screen as before and nothing happens.
Can someone please help with this? Thanks for your time.
Hi,
First of all ensure that you have got authority to do this - trace with ST01.
Regards,
Alexander -
MOSS 2007- Migration Tools available for data extraction
Hi All,
I am working on MOSS 2007. I need to export the entire data set from the SharePoint portal to Excel files. The data could be from lists, libraries, content types, static page data, etc. The amount of data is huge.
1) Is there any migration tool available for such an extraction? Kindly suggest.
2) Also, is it possible to search for data in the entire web site based on the content type?
Thanks.
Hi,
We don't have a ready script; you might have to modify one.
Check the below URLs to see if they help you:
http://blogs.technet.com/b/heyscriptingguy/archive/2014/02/04/use-powershell-to-create-csv-file-to-open-in-excel.aspx
http://technet.weblineindia.com/web/export-data-to-excel-using-openxml-sdk/2/
https://support.office.com/en-us/article/Share-Excel-data-with-others-by-exporting-it-to-a-SharePoint-site-262ba5a9-d065-4ad6-9d92-c16a10ae8ac8?ui=en-US&rs=en-US&ad=US
https://selecteditemsexport.codeplex.com/
Regards
Prasad Tandel -
How to Extract the Business content Master Data
Hello,
I activated the Master Data Business Content 0NOTIFICATION_ATTR and 0NOTIFICATION_TEXT on ECC 6.0; these belong to Quality Management master data.
How do I fill the data? Right now it has 0 records.
Do we need to fill the setup tables?
Thanks
Hi Roshan,
If you are loading from a flat file, follow the steps below.
Uploading of master data:
Log on to your SAP system.
Transaction code RSA1 leads you to Modeling.
1. Creation of Info Objects
- In the left panel, select InfoObjects
- Create an InfoArea
- Create InfoObject catalogs (characteristics & key figures) by right-clicking the created InfoArea
- Create new characteristics and key figures under the respective catalogs according to the project requirements
- Create the required InfoObjects and activate them.
2. Creation of Data Source
- In the left panel, select DataSources
- Create an application component (AC)
- Right-click the AC and create a DataSource
- Specify the DataSource name, source system, and data type (master data attributes, texts, hierarchies)
- In the General tab, give the short, medium, and long descriptions.
- In the Extraction tab, specify the file path, the header rows to be ignored, the data format (CSV), and the data separator (,)
- In the Proposal tab, load example data and verify it.
- In the Fields tab, you can give the technical names of the InfoObjects in the template; then you do not have to map them during the transformation, as the server will map them automatically. If you do not map them in this Fields tab, you have to map them manually during the transformation in the InfoProviders.
- Activate the DataSource and read the preview data under the Preview tab.
- Create an InfoPackage by right-clicking the DataSource, and in the Schedule tab click Start to load the data to the PSA. (Make sure to close the flat file during loading.)
3. Creation of data targets
- In the left panel, select InfoProviders
- Select the created InfoArea and right-click to select Insert Characteristic as InfoProvider
- Select the required InfoObject (e.g., Employee ID)
- Under that InfoObject, select Attributes
- Right-click Attributes and select Create Transformation.
- In the source of the transformation, select the object type (DataSource) and specify its name and source system. Note: the source system will be a temporary folder or package into which the data is stored
- Activate the created transformation
- Create a data transfer process (DTP) by right-clicking the master data attributes
- In the Extraction tab, specify the extraction mode (Full)
- In the Update tab, specify the error handling (request green)
- Activate the DTP, and in the Execute tab click the Execute button to load the data into the data targets.
4. Monitor
Right-click the data targets, select Manage, and in the Contents tab select Contents to view the loaded data. Alternatively, the monitor icon can be used.
Regards
Ani -
Data Reconciliation Data Sources in Business Content
Can you tell me where we can find these DataSources and explain how to use them? Do we need to define an InfoCube/ODS or anything like that to load the data and use a report to see the results?
Please explain with one complete scenario.
Thanks.
Data reconciliation for DataSources allows you to ensure that the data loaded into BI is consistent, available, and used productively there.
It is based on a comparison of the data loaded into BI with the application data in the source system. You can access the data in the source system directly to perform this comparison.
The term "data reconciliation DataSource" is used for DataSources that are used as a reference for accessing the application data in the source directly, and that therefore allow you to draw a comparison with the source data.
It allows you to check the integrity of the loaded data by, for example, comparing the total of a key figure in the DataStore object with the corresponding totals that the VirtualProviders access directly in the source system.
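The key-figure comparison described above amounts to summing the same key figure on both sides and checking the difference. A minimal sketch in Python (the record layout, field name, and tolerance are assumptions for illustration, not the BI implementation):

```python
def reconcile_totals(bi_rows, source_rows, key_figure, tolerance=0.01):
    """Compare the total of one key figure in the BI-loaded rows against the
    total computed from records read directly in the source system."""
    bi_total = sum(row[key_figure] for row in bi_rows)
    source_total = sum(row[key_figure] for row in source_rows)
    return abs(bi_total - source_total) <= tolerance, bi_total, source_total

# Example: equal totals mean the load reconciles; a mismatch needs investigation.
ok, bi, src = reconcile_totals(
    [{"amount": 100.0}, {"amount": 50.0}],
    [{"amount": 100.0}, {"amount": 50.0}],
    "amount",
)
```

In BI itself this comparison is typically done in a query over the DataStore object and the reconciliation VirtualProvider rather than in code.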
Hope it helps you. -
Can't create DTP for cube migrated from business content to 2004s
I am in the process of converting 3.5 Business Content InfoProviders to the new model for 2004s.
Master data was not a problem, but I'm having a hard time getting InfoCubes to migrate.
I installed 0PUR_C03 from Business Content and replicated the DataSource into the Data Warehouse.
I right-clicked 2LIS_02_S013, selected Additional Functions, then chose Transformation erzeugen (a.k.a. Create Transformation in English).
The transformation is generated with no errors and it was activated. The DataSource is migrated automatically in the process.
Then, I went to add the Data Transfer Process and that is where I got stuck.
I right-clicked the cube and chose Create DTP. For the source type, I selected DataSource. A dialog box then shows the objects associated with the cube, including the converted InfoSource, but nothing is highlighted for me to choose from, and I can't complete the process of creating the DTP.
Can anyone tell me where I might have gone wrong?
You got it...
Message was edited by:
Raman
Regular expression for date extraction
hi all,
Please show me how I can extract a date from a statement. The date has four forms, such as:
8-8-2008 or 8/8/2008 or 8-aug-2008 or 8.8.2008
The following query can extract only the form 8/8/2008:
SELECT REGEXP_SUBSTR(' the conference will be on 8/8/2008', '[0-9]{1,}/[0-9]{1,}/[0-9]{2,}') FROM dual;
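A pattern covering all four forms needs to allow -, / or . as the separator and a three-letter month name in the middle position. A sketch of such a pattern in Python (day-first forms assumed, as in the question); the same character classes should translate to Oracle as REGEXP_SUBSTR(col, '[0-9]{1,2}[-/.]([0-9]{1,2}|[[:alpha:]]{3})[-/.][0-9]{4}'):

```python
import re

# Day, then a numeric month or a three-letter month name, then a four-digit
# year, separated by -, / or . (the four forms given in the question).
DATE_RE = re.compile(r"\b\d{1,2}[-/.](?:\d{1,2}|[A-Za-z]{3})[-/.]\d{4}\b")

def extract_date(text):
    """Return the first date-like token found in `text`, or None."""
    m = DATE_RE.search(text)
    return m.group(0) if m else None
```

Note this only extracts a date-shaped token; as the discussion below points out, it cannot decide whether 9/8/2008 means September 8 or August 9 without a format mask.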
regards
Ayham
Edited by: Ayham on Jul 29, 2012 10:17 AM
Solomon Yakobson wrote:
Slow_Moe wrote:
so why do you keep nagging about 10-11-12??
Because the OP asked for help validating a date. In order to do that we need to validate year, month, and day. When we see 8/8/2008 we can't figure out whether the first 8 represents the month or the day; we know 2008 is the year, and that the month is August and the day is 8, only because 8 is used in both positions. But the OP will have other dates, right? So if 15/8/2008 comes along, how can we validate it without knowing whether 15 represents the day or the month? We could say that since 15 > 12 it represents the day and therefore 8 represents the month, and 8/15/2008 could be validated using the same logic. In all such cases we can deduce the actual date. But what about 9/8/2008? We can validate it and tell it is a valid date, but we can't tell what that date is. So the OP has to clarify what is actually needed: validate the date regardless of format without being able to tell the actual date, or validate the date based on some format. If the latter, the OP needs to provide a format mask.
SY.
Exactly why is it that many of you "guru status" wannabe aces always presume everyone else is a complete idiot?
+"But what about 9/8/2008. We can validate it and tell it is a valid date but we can't tell what that date is."+
Wow, really? You must be some kind of genius! Come on now, don't you find it embarrassing to believe this is something you have to explain to people? No? OK, I'll grant you 100 reward points for that incredible insight... no, wait, 1000. It was that good. -
Finding parts to repair drive for data recovery
I have had a hard drive on a PowerBook 12" go down. I bought it in 2004, and it seems that the data recovery company I have sent it to is having problems sourcing another drive so they can use its parts to repair the disk and recover the data. They seem to have gone to lengths to try and source what they need, but I thought the discussion forum here might have some experience with this. The disk is a Fujitsu model MHT2060AT, the original 60 GB disk that was in the laptop. It now lives(?) outside of the laptop, which has had a replacement disk fitted...
...Any suggestions would be appreciated.
Well, as far as they have said, they need to find comparable donor parts to repair the disk in order to recover the data, and so far they have not, in spite of an extensive search.
Also, they were not quoting cheap either. They have given me the option of simply having the drive returned, which I am inclined to accept, or else going to collect it in person. They have, after all, had two months. I think my real problem has been that at the start I was really hesitant in choosing a company, as I could not verify the quality or value of any of them: I took quite a while to pick one, and I picked this one finally after asking around and finding there wasn't really anyone who had a better idea than myself.
Now I am in the position of potentially being back where I started, 1 down, all the rest to go, with a carriage fee every time I send it back out, and with the prospect of losing the disk in transit, or stumbling across a completely rogue operation.
Here I am therefore, needing to find a company with an established reputation...
... if there are some testimonials for the company you suggest from other Apple users, I would really appreciate them. I know it isn't cheap. I need to be convinced on the next choice, though, before I part with another bean, and I would appreciate advice from anyone on what process may take place and what I should reasonably expect to pay... -
Is a third-party tool required for data extraction to non-SAP system?
Hi,
I want to extract data to non-SAP systems with the open hub service.
If I choose Database tables as open hub destinations, I would like to know if a third-party tool is mandatory and if it's possible to send data directly from database tables in BW to non-SAP systems.
Many thanks
Ella
You can choose a flat file on the application server and then transfer the file from the application server to the other server where you want to use it.
In case you want to use a table in the open hub service: any software that can connect to the Oracle DB (or whichever database you have) can access the data from the table.
Regards,
Bwer
Assign points if helpful. -
Need help for data extraction from table
Hi all,
I have a requirement where I need to identify the fields PERNR, BEGDA, ENDDA, REASN, and FMLAN in table PA0672. If all these fields are identical for an employee in more than one record, then only the greatest value of the REMAN field among those records should be displayed in the output.
How would I go about it?
Thanks
Snehasish
select pernr begda endda reasn reman
  into table itab
  from pa0672.
* sort so that records with identical keys are adjacent, highest reman first
sort itab by pernr begda endda reasn reman descending.
* delete all but the highest reman value for each key
delete adjacent duplicates from itab comparing pernr begda endda reasn. -
Mailbox Export - Filtered for Date Range and Body Content
I have been pulling my hair out with this one and hopefully someone can help.
I need to search a mailbox and export all (sent & received) messages that fit in a date range and also have a specific word in the body. The query I am using doesn't seem to find the term at all, but the messages it does find fit within my date range, so part of it seems to work. The request was also to not use wildcards, only looking for the specific word in the body. Here is my one-liner:
New-MailboxExportRequest -ContentFilter {(Body -like "Something") -and (Received -gt "07/01/2012") -and (Received -lt "07/31/2012") -or (Sent -gt "07/01/2012") -and (Sent -lt "07/31/2012")} -Mailbox "SearchMailbox" -Name "Label1" -FilePath \\server\temp\pst\Export.pst
I've also tried to search for a common word like "the" just to make sure it wasn't a case of the term simply not being in the mailbox. No dice.
Hi,
Agree with Nickeyclayton: you can use (Body -like "*something*") instead of (Body -like "something").
I recommend you refer to the following article:
Filterable Properties for the -ContentFilter Parameter
Property: Body
Description: This property returns messages that have the specified string within the message body.
Values: String (wildcards accepted)
Example syntax: -ContentFilter {Body -like "*prospectus*"}
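The practical difference between the two filters is plain wildcard matching. Python's fnmatch can stand in for the -like operator to illustrate it (a sketch only; the example body text is made up and this is not Exchange itself):

```python
from fnmatch import fnmatch

body = "Please find the prospectus attached."

# -like without wildcards only matches when the pattern equals the whole string...
exact = fnmatch(body, "prospectus")

# ...while *prospectus* matches the word anywhere within the body.
wild = fnmatch(body, "*prospectus*")
```

This is why the original filter, (Body -like "Something"), found nothing: no message body consists of exactly that one word.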
Hope this helps!
Thanks.
Niko Cheng
TechNet Community Support