"Z" tables extraction
Hello all,
Does anyone know how to configure "Z" tables to bring all records over? Or does it bring over everything automatically? What I found in our first test is that, if a "Z" table has an SAP date domain, e.g. ERDATE, it will check this date against the beginning date that we specified in the configuration.
Is there a way that we can configure to skip that check for the "Z" tables?
Thanks,
Tai
Hello Tai,
By default, TDMS transfers all client-dependent Z tables to the target system fully (without any reduction).
However, if you want to exclude or reduce those tables from the transfer in a particular package, please
execute the activities 'Analyze Table Sizes' and 'Maintain Table Reduction' in the System Analysis phase, where you need to choose the option 'Customize', which in turn gives you a list of tables. There you can choose the tables to Reduce / Exclude / Mark for full transfer.
Please let me know if you need any clarification.
Regards,
Srila.
Similar Messages
-
XML Column from table extract to Pipe Delimited Text File
Hi,
I have an XML column with large data in a Table ( Source SQL server Database).
I was asked to extract XML column to .txt file using SSIS.
Is it possible to extract xml column with huge data to text file ?
When I selected the XML column from the table in the source, I noticed that the property of the column is taken as [DT_NTEXT]. I converted it to DT_TEXT, as ANSI does not support DT_NTEXT.
The execution succeeded at first, but it then failed due to truncation, so I am wondering: is there a way to get the XML column extracted to a pipe-delimited text file?
Is it advisable to do this? Is it even valid to export XML in a pipe-delimited file?
Please kindly advise.
thanks
kodi
Are you looking at shredding data within the XML nodes and then exporting it to a text file, or are you looking at exporting the XML value as is? Also, is SSIS a necessity?
If not, you can simply use T-SQL along with bcp. Just use a query like
EXEC xp_cmdshell 'bcp "SELECT CAST(XMLColumn AS varchar(max)) AS Column FROM table" queryout <full file path> -c -S <ServerName> -T -t"|"'
provided you use trusted connection (windows authentication)
see
http://visakhm.blogspot.in/2013/10/bcp-out-custom-format-data-to-flat-file.html
If you want to shred the data use Xpath functions in the query as below
http://visakhm.blogspot.in/2012/10/shred-data-as-well-as-metadata-from-xml.html
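One practical wrinkle with pipe-delimited output, whichever route you take: XML text can itself contain the | character. A minimal Python sketch (illustrative only; the rows below are invented, not from the poster's database) shows how a csv writer with quoting keeps embedded pipes from breaking the columns:

```python
import csv

# Invented (id, xml_text) rows standing in for a fetched result set.
rows = [
    (1, "<order><id>1</id></order>"),
    (2, "<note>contains a | pipe</note>"),
]

with open("export.txt", "w", newline="") as f:
    # QUOTE_MINIMAL quotes only fields that contain the delimiter,
    # so an embedded "|" inside the XML cannot split a column.
    writer = csv.writer(f, delimiter="|", quoting=csv.QUOTE_MINIMAL)
    for row in rows:
        writer.writerow(row)
```

Plain bcp with -t"|" has no such protection; if the data can contain pipes, either pick a delimiter that cannot occur in the XML or add quoting as above.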
Please Mark This As Answer if it helps to solve the issue Visakh ---------------------------- http://visakhm.blogspot.com/ https://www.facebook.com/VmBlogs -
Enhancement to 2LIS_02_SCL, EBAN table extraction & Actual GI date
Hello Guyz
I need to extract 3 important fields into BI:
1. Committed date on the Purchase Order schedule line (field: DAT01, table: EKET). This is not available as standard in 2LIS_02_SCL in LBWE. I can enhance the extractor and populate the field with a user exit. But I believe deltas might not get triggered for this field. Is my assumption correct? If so, how can I extract this field and ensure that deltas will flow into BW?
2. EBAN extraction --> Purchase Requisition release date (field: FRGDT, table: EBAN): I need to bring 'Purchase Requisition' information into BW, but there is no standard extractor available in ECC. Am I left with building a generic delta extractor, or can someone suggest a better way?
3. Where is the actual Goods Issue date stored for purchase orders? In the EKET table we have a GI date and a GR date, but those are planned dates. We do have the actual GR date in 2LIS_02_SGR, but that extractor is built only for Goods Receipt information. So how can I get Goods Issue information?
Please provide an answer to any/all of the above.
Thanks,
Srujan.
For your first question, you could do the below.
Assuming that the DataSource is delta capable, you can populate the new field data using 'Repair Full Load':
1. Create a View based on the base tables and EKET. Include the new fields along with the DSO key(s). Create a generic extractor based on the view. Write CMOD code to populate your new field. The CMOD code is as below. Test the extractor in RSA3 to make sure the records are being extracted. Replicate the DS in BW.
2. Create transformation for your Datastore with this new datasource. Create a full load Infopackage. Under the Scheduler tab of Infopackage, select 'Indicate request as repair request'. Schedule the IP. This will populate all historical data for your new fields without harming the deltas.
NOTE: If the record volume is huge, you may want to limit the size using some filter criteria. For example, load the data only for a couple of years.
Once DSO is loaded and data activated, Infocube will automatically pick these delta records.
data: v_dat01 like eket-dat01,
      l_tabix like sy-tabix.
CASE i_datasource.
  WHEN '2LIS_02_SCL'.
    LOOP AT c_t_data INTO wa_t_data.
      l_tabix = sy-tabix.
      SELECT SINGLE dat01 INTO v_dat01 FROM eket WHERE <condition>.
      IF sy-subrc = 0.
        wa_t_data-dat01 = v_dat01.
        MODIFY c_t_data FROM wa_t_data INDEX l_tabix.
      ENDIF.
    ENDLOOP.
ENDCASE.
For your second question, building a generic extractor will be ideal. Hope this helps.
-Mann -
Hi,
We have built a generic extractor that takes data from the CDPOS and CDHDR cluster tables. As the number of entries is huge, it takes a long time to load the data. We take the release status (FRGZU) and release date (FRGDT) from these tables.
Whenever anyone releases an order, an entry comes into the above tables. We are now searching for other options to get this data, as the generic extraction is really time-consuming.
Can anyone please advise on different options? One option is to create a table in R/3 that contains entries for release status only, i.e. whenever anyone releases a purchase order, an entry is written to this table through a user exit.
Help will definitely be rewarded.
Thanks!
Hi Burberry BIW Team,
With regard to the generic extractor mentioned above;
did you create a function module?
We have also got a similar situation.
I was wondering if you could help.. not too familiar with ABAP coding and function modules..
Any help will be really appreciated.
Thank you. -
SAP query change the internal table extracted after a join instruction
Hi Gurus
I would like to know if it is possible to change the data in the internal table after the
join instruction has been executed.
I need to delete some of the extracted lines if certain conditions are satisfied.
Do you think this is not possible and it is better to create a program?
Thanks in advance
HI,
use SELECT ... ENDSELECT:
select mseg~mblnr mkpf~budat mkpf~bktxt
mseg~bukrs mseg~werks mseg~matnr mseg~menge mseg~aufnr mseg~smbln mseg~bwart mseg~shkzg mseg~meins
afko~rsnum afpo~verid
mast~stlnr mast~stlal stpo~idnrk stpo~menge stpo~meins stko~bmeng
into (itab-mblnr,
itab-budat,
itab-bktxt,
itab-bukrs,
itab-werks,
itab-matnr,
itab-menge,
itab-aufnr,
itab-smbln,
itab-bwart,
itab-shkzg,
itab-meins,
itab-rsnum,
itab-verid,
itab-stlnr,
itab-stlal,
itab-matnr_r,
itab-bdmng,
itab-meins_r,
itab-bmeng)
from mkpf
inner join mseg on mseg~mblnr eq mkpf~mblnr and mkpf~mjahr eq mseg~mjahr
inner join mara on mara~matnr eq mseg~matnr
inner join afpo on afpo~aufnr eq mseg~aufnr
inner join mast on mast~matnr eq mseg~matnr and mast~werks eq mseg~werks
inner join afko on afko~aufnr eq mseg~aufnr and afko~stlal eq mast~stlal
inner join stko on stko~stlnr eq mast~stlnr and stko~stlal eq mast~stlal
inner join stpo on stpo~stlnr eq stko~stlnr
inner join stas on stas~stlnr eq mast~stlnr and stas~stlal eq mast~stlal and stas~stlkn eq stpo~stlkn
where mseg~werks in s_werks
and mseg~matnr in s_matnr
and mkpf~budat in s_budat
and mseg~aufnr in s_aufnr
and mseg~bwart eq '101'
and mast~stlan eq '1'
and stko~stlty eq 'M'
and stpo~stlty eq 'M'.
if condition.
append itab.
endif.
clear itab.
endselect.
-
Need seosubcodf table extract from CRM 4.0 system
Hi,
Does anyone has CRM 4.0 system access. I need table SEOSUBCODF extracts.
Do reply me soon.
Thanks,
Saurabh.
Paritosh,
Our dev landscape(on windows environment) looks like:
CRM ABAP - WAS 620 - ABAP box
CRM J2EE - WAS 640 ABAP+J2EE - Java Box
I don't know if you can install the 640 J2EE WAS on the same box as a 620 ABAP box. However, you should be able to use the 640 J2EE environment with CRM (we are using it in production).
We are running non-IDES 4.0 SP8.
It would probably be easier, if you have two boxes, to install the ABAP stack on one box and the J2EE on the other, for better performance. I also remember there are some restrictions on TREX/IPC/etc. being on the same box if you are using all the functionality provided. We currently don't use CRM for "pricing-relevant" transactions.
Good luck,
Stephen -
Hi Guys...
I want to extract schema tables into .csv files.
I know that for one table it is as easy as this:
set heading off
set space 0
set pages 0
set linesize 32767
set colsep '|'
set echo off
set feedback off
spool my_csv.csv
select /*+ PARALLEL(A,6) */ * from csv_tab A;
spool off
exit;
How can I recreate this script to include all the tables in the schema?
Please help!
If you want to create a script with all the SQL statements to re-create all tables, you could use something like this:
set serveroutput on size 1000000
spool yourtables.sql
declare
  cursor c1 is select table_name from user_tables;
  query varchar2(32767);
begin
  for r1 in c1 loop
    query := dbms_metadata.get_ddl('TABLE', r1.table_name);
    dbms_output.put_line(query||chr(10)||chr(10));
  end loop;
end;
/
spool off
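If the goal were a flat-file dump per table rather than DDL, the same loop-over-the-catalog idea can be sketched outside SQL*Plus as well; here is an illustrative Python version using SQLite purely as a stand-in for Oracle (the tables and data are invented for the demo):

```python
import csv
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE emp (id INTEGER, name TEXT);
    INSERT INTO emp VALUES (1, 'A'), (2, 'B');
    CREATE TABLE dept (id INTEGER, title TEXT);
    INSERT INTO dept VALUES (10, 'Sales');
""")

# Enumerate the user tables from the catalog (the counterpart of
# looping over user_tables), then spool each to its own pipe-delimited file.
tables = [r[0] for r in conn.execute(
    "SELECT name FROM sqlite_master WHERE type = 'table'")]
for t in tables:
    cur = conn.execute(f"SELECT * FROM {t}")  # t comes from the catalog
    with open(f"{t}.csv", "w", newline="") as f:
        writer = csv.writer(f, delimiter="|")
        writer.writerow([d[0] for d in cur.description])  # header row
        writer.writerows(cur)
```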
HTH
FJFranken -
Best way to coordinate table extracts?
What is the most efficient method of extracting rows from one table that are dependent on the existence of corresponding rows in another table?
For example:
I have a table containing information about WorkOrders with the WorkOrder Number as the PK. I have another table containing the Parts that belong to the WorkOrders. The tables are linked via the WorkOrder Number and is a one-to-many relationship between WorkOrder and Parts.
I want to extract only those rows in my Parts table where I have WorkOrders in my WorkOrder table. Kind of like 'select * from PARTS where WORKORDER in (select WORKORDER from WORKORDERS)'.
I don't want to incur the expense of a JOIN, and I don't want to hardcode that much in a FILTER operator (it didn't seem like a best practice), so...
What are the recommendations of the populace?
Thanks very much for your advice.
Gary
Gary,
One solution would be to have a foreign key in the parts table that points to the PK in the workorders table, eliminating the problem at its root, but I guess this is out of the question; you would not be looking for a solution if you did not already have orphan records in your parts table.
The second solution is a join. I think the cost of doing a join will be less than checking individually, for every part record, whether there is an existing order. Properly indexed joins can be very efficient. A filter will not be so efficient in this case, as a filter has to be selective.
The third would be to have a view where you design your own expression - the expression with in ... mentioned by you is one of the possibilities.
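For what it's worth, "only the parts that have a work order" is a semi-join, which is exactly what IN/EXISTS expresses, and a good optimizer turns it into much the same plan as a join to the PK. A small sketch with Python's sqlite3 (the schema and data are invented for illustration, not the actual tables):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE workorders (wo_num INTEGER PRIMARY KEY);
    CREATE TABLE parts (part_id INTEGER, wo_num INTEGER);
    INSERT INTO workorders VALUES (100), (200);
    INSERT INTO parts VALUES (1, 100), (2, 200), (3, 999);
""")

# Semi-join via EXISTS: each part row is kept at most once, and the
# orphan (wo_num 999 has no work order) is filtered out.
rows = conn.execute("""
    SELECT part_id FROM parts p
    WHERE EXISTS (SELECT 1 FROM workorders w WHERE w.wo_num = p.wo_num)
    ORDER BY part_id
""").fetchall()
```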
Regards:
Igor -
JCDS Tables extraction into BW
Hello Punters,
Please tell me how we can extract the JCDS table into the header table (2LIS_04_P_MATNR). We have one process order in MATNR and multiple statuses in JCDS. What would be the best approach?
Thanks in advance
Victor
Hi,
It depends on the business requirement. For instance, let's say you want the active records for a project. Then you can use the "Change Number" and "Inactive" fields combined to identify the latest open statuses for the given object number, and then extract just those.
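To make the "latest status via change number plus inactive flag" idea concrete, here is a hedged Python sketch over invented JCDS-style records (the fields mirror OBJNR/STAT/CHGNR/INACT; this illustrates only the selection logic, not actual extractor code):

```python
# Invented change documents: (objnr, stat, chgnr, inact).
records = [
    ("OR000001", "I0002", "001", ""),   # status set
    ("OR000001", "I0002", "003", "X"),  # same status later deactivated
    ("OR000001", "I0045", "002", ""),   # another status, still active
]

# Per (object, status), keep only the entry with the highest change number.
latest = {}
for objnr, stat, chgnr, inact in records:
    key = (objnr, stat)
    if key not in latest or chgnr > latest[key][0]:
        latest[key] = (chgnr, inact)

# A status is open/active if its latest change is not flagged inactive.
active = sorted(stat for (objnr, stat), (chgnr, inact) in latest.items()
                if not inact)
```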
Regards
Sreekanth -
Hi,
I'm hoping that someone will be able to explain how I extract the first element (in this case the name) from the hashtable in the server code below (case 'N':).
import java.net.*;
import java.io.*;
import java.util.*;
public class Server {
    private PrintWriter pw;
    private BufferedReader bf;
    private ServerSocket ss;
    private Socket s;
    // Hashtable used to store test data
    private Hashtable emails;

    public Server() throws Exception {
        // Set up hash table with test data
        emails = new Hashtable();
        emails.put("Ince", "[email protected]");
        emails.put("Roberts", "[email protected]");
        emails.put("Timms", "[email protected]");
        emails.put("Rowlands", "[email protected]");
        emails.put("Eustace", "[email protected]");
        emails.put("Lord", "[email protected]");
        System.out.println("...Setting up server socket");
        ss = new ServerSocket(1200);
        System.out.println("..waiting for connection ");
        s = ss.accept();
        System.out.println("..connection made");
        InputStream is = s.getInputStream();
        bf = new BufferedReader(new InputStreamReader(is));
        OutputStream os = s.getOutputStream();
        pw = new PrintWriter(os, true);
    }

    public void Run() throws Exception {
        boolean cont = true;
        while (cont) { // server runs while true
            String clientLine = bf.readLine();
            System.out.println(clientLine); // used to check the connection
            switch (clientLine.charAt(0)) {
            case 'E':
                String staffName = (String) emails.get(clientLine.substring(1));
                pw.println(staffName);
                break;
            case 'N':
                String emailAddress = (String) emails.get(clientLine.substring(1));
                pw.println(emailAddress);
                break;
            case 'U':
                String number = (String) emails.get(clientLine);
                if (number == null)
                    pw.println(number);
                break;
            case 'Q':
                pw.close();
                bf.close();
                s.close();
                cont = false;
                break;
            }
        }
    }

    public static void main(String[] args) throws Exception {
        Server serv = new Server();
        serv.Run();
    }
}
1) Use HashMap if possible, rather than Hashtable.
2) HashMap and Hashtable don't have a "first" element. You can call entrySet(), keySet(), or values() and get the first element from the resulting collection's iteration, but that could be any element from the map--first added, last added, middle. It has no relation to the order added or to any sorting.
If you want to retrieve in the order added, use LinkedHashMap.
If you want to retrieve in a sorted order, use a SortedMap, such as TreeMap.
Make sure you implement equals and hashCode correctly, and if you want to sort, implement Comparable or provide a Comparator.
http://developer.java.sun.com/developer/Books/effectivejava/Chapter3.pdf
Making Java Objects Comparable
http://java.sun.com/docs/books/tutorial/collections/interfaces/order.html
http://www.javaworld.com/javaworld/jw-12-2002/jw-1227-sort.html -
How to insert a picture into a table, extracting from another table, in a form
I have a form with an image item through which I am saving a picture with datatype LONG RAW or BLOB. This is working properly.
But when I save this picture from the screen into another table having the same field name and the same datatype,
it does not work.
Can anybody find a solution?
om prakash sahoo
ocac, bbsr
Find it under the Java tutorial... you should download and run it locally. It's a good tutorial with all the stuff you need to know...
-
MIR4 TABLE EXTRACTION TO EXCEL
Hi All Experts,
Need your valuable input in scripting:
I want to download the PO Ref table from T-code MIR4 to Excel (Amount, Purchase Order & Order).
The problems are:
a) How would the script read the number of rows in the table?
b) How would the values be stored & saved in Excel?
I am presently counting the number of rows and copying manually in the SAP GUI. (The script has to count the number of rows in the table and copy the details to Excel.)
Sample:
If Not IsObject(application) Then
Set SapGuiAuto = GetObject("SAPGUI")
Set application = SapGuiAuto.GetScriptingEngine
End If
If Not IsObject(connection) Then
Set connection = application.Children(0)
End If
If Not IsObject(session) Then
Set session = connection.Children(0)
End If
If IsObject(WScript) Then
WScript.ConnectObject session, "on"
WScript.ConnectObject application, "on"
End If
session.findById("wnd[0]/usr/txtRBKP-BELNR").text = strInvRef
session.findById("wnd[0]").sendVKey 0
Cell1 = session.findById("wnd[0]/usr/subHEADER_AND_ITEMS:SAPLMR1M:6005/subITEMS:SAPLMR1M:6010/tabsITEMTAB/tabpITEMS_PO/ssubTABS:SAPLMR1M:6020/subITEM:SAPLMR1M:6310/tblSAPLMR1MTC_MR1M/txtDRSEG-WRBTR[1,0]").Text
Cell2 = session.findById("wnd[0]/usr/subHEADER_AND_ITEMS:SAPLMR1M:6005/subITEMS:SAPLMR1M:6010/tabsITEMTAB/tabpITEMS_PO/ssubTABS:SAPLMR1M:6020/subITEM:SAPLMR1M:6310/tblSAPLMR1MTC_MR1M/txtDRSEG-WRBTR[1,1]").Text
Cell3 = session.findById("wnd[0]/usr/subHEADER_AND_ITEMS:SAPLMR1M:6005/subITEMS:SAPLMR1M:6010/tabsITEMTAB/tabpITEMS_PO/ssubTABS:SAPLMR1M:6020/subITEM:SAPLMR1M:6310/tblSAPLMR1MTC_MR1M/txtDRSEG-WRBTR[1,2]").Text
Cell4 = session.findById("wnd[0]/usr/subHEADER_AND_ITEMS:SAPLMR1M:6005/subITEMS:SAPLMR1M:6010/tabsITEMTAB/tabpITEMS_PO/ssubTABS:SAPLMR1M:6020/subITEM:SAPLMR1M:6310/tblSAPLMR1MTC_MR1M/txtDRSEG-WRBTR[1,3]").Text
Cell5 = session.findById("wnd[0]/usr/subHEADER_AND_ITEMS:SAPLMR1M:6005/subITEMS:SAPLMR1M:6010/tabsITEMTAB/tabpITEMS_PO/ssubTABS:SAPLMR1M:6020/subITEM:SAPLMR1M:6310/tblSAPLMR1MTC_MR1M/txtDRSEG-WRBTR[1,4]").Text
Cell6 = session.findById("wnd[0]/usr/subHEADER_AND_ITEMS:SAPLMR1M:6005/subITEMS:SAPLMR1M:6010/tabsITEMTAB/tabpITEMS_PO/ssubTABS:SAPLMR1M:6020/subITEM:SAPLMR1M:6310/tblSAPLMR1MTC_MR1M/txtDRSEG-WRBTR[1,5]").Text
'Cell7 = session.findById("wnd[0]/usr/subHEADER_AND_ITEMS:SAPLMR1M:6005/subITEMS:SAPLMR1M:6010/tabsITEMTAB/tabpITEMS_PO/ssubTABS:SAPLMR1M:6020/subITEM:SAPLMR1M:6310/tblSAPLMR1MTC_MR1M/txtDRSEG-WRBTR[1,6]").Text
session.findById("wnd[0]").sendVKey 12
set ns1=createobject("WScript.shell")
ns1.AppActivate "Microsoft Excel"
objSheet.Cells(2, 2) = Cell1
objSheet.Cells(3, 2) = Cell2
objSheet.Cells(4, 2) = Cell3
objSheet.Cells(5, 2) = Cell4
objSheet.Cells(6, 2) = Cell5
objSheet.Cells(7, 2) = Cell6
'objSheet.Cells(8, 2) = Cell7
Since nobody has given this question a try after this long, I will give it a shot: I am not familiar with that transaction (I don't even have access to it), but if you are going to end up with your data in Excel anyway, why not just download all the data and do the calculations in Excel (calculations are Excel's strength), with a Pivot Table or something like it?
-
Hi ,
I am creating a new vendor master repository.
Now, while importing lookup table values for the country table, the Default Import Action table shows that, out of 236 entries, 67 have match type None and 169 have match type Exact.
But since there are initially no lookup table values in the country table in MDM, why does it show some records as exact matches?
Logically it should show all 236 records as match type None.
Please throw some light on this.
Nilesh -
Maintenance view: how to read EXTRACT and TOTAL table
Hi, All,
I created a maintenance view. As stated in the documentation, there are two internal tables, EXTRACT and TOTAL, available in the runtime environment. Now I want to read a record from the internal table using "READ TABLE...."
In my example, the table structure has 3 fields: A, B, C.
So I tried "READ TABLE EXTRACT WITH KEY A = 'xyz' ASSIGNING <fs>", but the syntax check shows an error saying that the specified type has no structure and no component can be accessed.
So how can I search for a row in the tables EXTRACT and TOTAL by giving a field value? Is there any other way to get data from these tables?
Thank you!
Hi Yongying,
I know this is an old post, but, may be this is still helpful for you or for others with the same problem.
Just add the option CASTING to your READ statement:
READ TABLE extract WITH KEY a = 'xyz' ASSIGNING <fs> CASTING.
The field symbol <fs> must be fully typed, or at least be of the same type as the Z table from which the itab EXTRACT is generated.
Regards,
José Gabriel. -
Set up table data extracted from R/3 not visible in data target of BW
Hai friends,
I am currently working on extracting data from R/3 to BW. I read the several docs given in the forum and did the following:
1) In the LBWE transaction, my extract structure is already active.
2) In SBIW, i went to the filling of set up tables for QM
3) I executed the set up table extraction
4) Then, i checked in RSA3. The extraction was successful.
5) In BW, i replicated the datasource, and in the infopackage, i selected in the
PROCESSING TAB, "PSA and then into Data targets (Package by Package)
6) In UPDATE tab, i selected FULL UPDATE
7) And then i did immediate load.
8) In RSMO, it showed successful. (It showed the same number of records as in the
RSA3 of R/3)
But when I went into the data target (ODS) and checked its contents, nothing is visible. Why is that? Have I skipped any step? Please help.
Regards,
Neha Solanki
Hai,
You are right. It is an NW2004 system. This is what is displayed in the status tab in RSMO:
Data successfully updated
Diagnosis
The request has been updated successfully.
InfoSource : 2LIS_05_QE2
Data type : Transaction Data
Source system: QAS678
And I can find no such button as you said.
Regards,
neha Solanki