JEST Table Data Extraction
Hi Gurus,
I am using a custom transactional DataSource (ZJEST) created on the JEST table with full load. The JEST table has only 4 fields:
CHGNR Change number
INACT Indicator: Status Is Inactive
OBJNR Object number
STAT Object status
This table has around 18 million records, so each load extracts around 18 million records.
Is there any alternative to avoid loading this huge volume of data every time? Also, this table has no date field, so I cannot extract data based on time.
Can anyone offer their valuable inputs?
Thanks in advance,
Surya.
Hi,
If you check the JEST table, there is no field that reliably identifies which records were updated at the table level.
I don't think we can use OBJNR; maybe we can with the STAT field.
Another way:
From JEST you can only do a full load, because there is no delta-relevant field.
Create a custom table, say ZJEST, with an additional timestamp field, so that you can fetch only the updated data at the DataSource level.
The issue then is how the Z-table gets populated:
The easiest way is a daily ABAP job that copies from the standard table to ZJEST and maintains the timestamp on every update, so you can extract only those records.
(or)
Alternatively, ask the functional consultants how the JEST table is populated and whether there is any indication of how many records were updated today. The issue remains that your load is full, so even if you know the logic it is of no use until you convert the DataSource to delta.
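The first approach (a Z-table with a timestamp) can be sketched as follows. This is Python purely to illustrate the logic; the actual implementation would be the ABAP job described above, and the field names are illustrative:

```python
from datetime import datetime

def delta_fetch(shadow_table, last_load_ts):
    """Return only the rows changed since the previous load."""
    return [row for row in shadow_table if row["changed_at"] > last_load_ts]

# The daily job stamps every inserted/updated row with a timestamp:
shadow = [
    {"objnr": "OR000001", "stat": "I0001", "changed_at": datetime(2009, 12, 9, 8, 0)},
    {"objnr": "OR000002", "stat": "I0002", "changed_at": datetime(2009, 12, 10, 8, 0)},
]

# Instead of all 18 million records, only rows changed after the
# last successful load are extracted:
delta = delta_fetch(shadow, last_load_ts=datetime(2009, 12, 9, 12, 0))
```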
Regards
Ram.
Edited by: Ramakanth Deepak Gandepalli on Dec 10, 2009 11:05 AM
Similar Messages
-
Set up table data extracted from R/3 not visible in data target of BW
Hi friends,
I am currently working on extracting data from R/3 to BW. I read the several docs given in the forum and did the following:
1) In the LBWE transaction, my extract structure is already active.
2) In SBIW, I went to the filling of setup tables for QM.
3) I executed the setup table extraction.
4) Then I checked in RSA3; the extraction was successful.
5) In BW, I replicated the DataSource, and in the InfoPackage I selected, in the
PROCESSING tab, "PSA and then into Data Targets (Package by Package)".
6) In the UPDATE tab, I selected FULL UPDATE.
7) And then I did an immediate load.
8) In RSMO, it showed successful. (It showed the same number of records as in the
RSA3 of R/3)
But when I went into the data target (ODS) and checked its contents, nothing is visible. Why is that? Have I skipped any step? Please help.
Regards,
Neha Solanki

Hi,
You are right. It is an NW2004 system. This is what is displayed in the status tab in RSMO:
Data successfully updated
Diagnosis
The request has been updated successfully.
InfoSource : 2LIS_05_QE2
Data type : Transaction Data
Source system: QAS678
And I cannot find the button you mentioned.
Regards,
Neha Solanki -
Adding data in internal table using extracted data
Hi Experts,
Good day!
I have a requirement for our historical material-price data: add the entries that are missing, based on data extracted from standard tables such as A004 and KONP.
Now, I need to use the VALIDFROM (DATAB) value as the basis for the latest price.
To make it more clear, see the example below:
Extracted data:
Material Number Valid From Valid to Price
100101 01/01/2008 02/01/2008 100.00
100101 02/02/2008 04/02/2008 100.00
100101 04/03/2008 08/01/2008 200.00
100101 08/02/2008 01/31/2009 300.00
100102 05/02/2008 07/01/2008 10.00
100102 07/02/2008 10/31/2008 15.00
100102 11/01/2008 01/31/2009 20.00
Output:
Material Number Calmonth Price
100101 01/2008 100.00
100101 02/2008 100.00
100101 03/2008 100.00
100101 04/2008 200.00
100101 05/2008 200.00
100101 06/2008 200.00
100101 07/2008 200.00
100101 08/2008 300.00
100101 09/2008 300.00
100101 10/2008 300.00
100101 11/2008 300.00
100101 12/2008 300.00
100101 01/2009 300.00
100102 05/2008 10.00
100102 06/2008 10.00
100102 07/2008 15.00
100102 08/2008 15.00
100102 09/2008 15.00
100102 10/2008 15.00
100102 11/2008 20.00
100102 12/2008 20.00
100102 01/2009 20.00
The entries in bold are the added data. What is the best code for this?
How can I come up with this output? Please help me.
Thanks and God bless,
Nips

Hi Nips,
The logic should be something along these lines (a sketch assuming fields MATNR, DATAB, DATBI and PRICE in the work area):
DATA lv_month TYPE n LENGTH 6.

LOOP AT itab INTO watab.
  lv_month = watab-datab(6).           "YYYYMM of valid-from
  WHILE lv_month <= watab-datbi(6).    "up to YYYYMM of valid-to
    gw_output-matnr    = watab-matnr.
    gw_output-calmonth = lv_month.
    gw_output-price    = watab-price.
    APPEND gw_output TO gt_output.
    "advance to the next calendar month
    IF lv_month+4(2) = '12'.
      lv_month = lv_month + 89.        "YYYY12 -> (YYYY+1)01
    ELSE.
      lv_month = lv_month + 1.
    ENDIF.
  ENDWHILE.
ENDLOOP.
"where a month is covered twice, keep the entry with the later DATAB
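The month-expansion logic can also be verified with a small script (Python here purely for illustration; the sample data is taken from the post). For a month covered by two validity periods, the record with the later valid-from date wins, which matches the expected output above:

```python
from datetime import date

def expand_to_months(rows):
    """Expand (material, valid_from, valid_to, price) rows to one price
    per calendar month; the later valid_from wins on overlapping months."""
    result = {}
    for matnr, valid_from, valid_to, price in rows:
        m = date(valid_from.year, valid_from.month, 1)
        while m <= valid_to:
            key = (matnr, m.strftime("%m/%Y"))
            prev = result.get(key)
            if prev is None or valid_from >= prev[0]:
                result[key] = (valid_from, price)
            # advance to the first day of the next month
            m = date(m.year + (m.month == 12), m.month % 12 + 1, 1)
    return {k: v[1] for k, v in result.items()}

rows = [
    ("100101", date(2008, 1, 1), date(2008, 2, 1), 100.00),
    ("100101", date(2008, 2, 2), date(2008, 4, 2), 100.00),
    ("100101", date(2008, 4, 3), date(2008, 8, 1), 200.00),
    ("100101", date(2008, 8, 2), date(2009, 1, 31), 300.00),
]
prices = expand_to_months(rows)
```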
Best regards,
Prashant -
How can I join 3 tables while extracting data from SAP R/3?
I have 3 tables with the following columns
Emp table (emp)
emp_id
emp_name
emp_add
Dept table (dept)
dept_id
dept_name
dept_loc
Location table (loc)
loc_id
loc_name
Now, if I want to select data for loc_id = 10 and emp_id between 2000 and 3000:
How to join these three tables while extracting data from R/3
join condition
loc.loc_id = dept.loc_id
and dept.dept_id = emp.dept_id
and loc.loc_id =10
and emp.emp_id between 2000 and 3000.
Could anyone let me know the procedure to extract this data into the BW system?

Hi,
shouldn't your join condition be:
loc.loc_id = dept.DEPT_LOC
and dept.dept_id = ??
If you can join the three tables, then create a generic DataSource (transaction RSO2) based on a view (create your view with your join in SE11).
Enable the loc_id and the emp_id as selectable in the datasource so you can then select the values from a BW IPack.
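The corrected join condition can be checked on a few sample rows (Python for illustration only; the table and column names come from the post, the sample values are made up):

```python
emp  = [{"emp_id": 2500, "emp_name": "A", "dept_id": 1},
        {"emp_id": 4000, "emp_name": "B", "dept_id": 2}]
dept = [{"dept_id": 1, "dept_name": "Sales", "dept_loc": 10},
        {"dept_id": 2, "dept_name": "HR",    "dept_loc": 20}]
loc  = [{"loc_id": 10, "loc_name": "NY"},
        {"loc_id": 20, "loc_name": "LA"}]

result = [
    (e["emp_id"], d["dept_name"], l["loc_name"])
    for l in loc
    for d in dept if d["dept_loc"] == l["loc_id"]   # loc joins dept on dept_loc
    for e in emp  if e["dept_id"] == d["dept_id"]   # dept joins emp on dept_id
    if l["loc_id"] == 10 and 2000 <= e["emp_id"] <= 3000
]
```

Only employee 2500, whose department sits in location 10, survives the filters.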
hope this helps...
Olivier. -
Data Extraction from SAP R/3 Table View in BW
Dear All,
I am trying to create a BW query from SAP R/3 Table View.
Till now I have created a table view
and extracted the DataSource in R/3 using Tcode RSA3.
In BW Side
1. I have replicated the data source in Source system
2. Created info object
and info object catalog
and assigned it
Now I am a little confused over what to do next,
whether to create an ODS object or an InfoCube.
Please guide me on the rest of the steps to follow.
Regards,
Gaurav

Hi,
After replicating DS in BW, you have to maintain ODS or Cube for maintaining data in BW.
After loading data into Data Targets you can create Reports based on those Data Targets.
For better reporting performance you can create a Cube rather than an ODS.
Info Source(After Creating info Objects, you have to create a Transfer Structure to map the Fields.)
Then create a Data Target ODS/Cube.
Update rules
Info Package
Hope it helps you.
Let us know if you still have any issues.
Reg
Pra -
Hi all,
We have a daily delta extraction set up for DataSource 0FI_GL_4 from R/3 into an ODS. I think there is a table on the R/3 side where we can RESET the extraction date to limit the delta reload in case something goes wrong with the delta process. Without this reset, it may take 2-3 weeks for us.
Does anyone have any idea about this table? Please...
Thanks.

Hi Venkat,
check OSS Note 485958 - BW-BCT-FI: Repeating delta extractions
Symptom
This note provides the option of repeating delta extractions under certain conditions.
The note is valid for the DataSources listed below:
0FI_AP_4 - Accounts Payable: Line Items
0FI_AR_4 - Accounts Receivable: Line Items
0FI_GL_4 - General Ledger: Line Items
0FI_TX_4 - Taxes: Line Items
0FI_AP_6 - Accounts Payable: Transaction Figures Using Delta Extraction
0FI_AR_6 - Accounts Receivable: Transaction Figures Using Delta Extraction
0FI_GL_6 - General Ledger: Transaction Figures Using Delta Extraction
0FI_AP_7 - Accounts Payable: Special G/L Transaction Figures Using Delta Extraction
0FI_AR_7 - Accounts Receivable: Special G/L Transaction Figures Using Delta Extraction
0FI_GL_7 - General Ledger: Cost of Sales Accounting Ledger Using Delta Extraction
0FI_AR_8 - Accounts Receivable: Credit Management: Central Data Using Delta Extraction
0FI_AR_9 - Accounts Receivable: Credit Management: Control Area Data Using Delta Extraction
0FI_AR_10 - Accounts Receivable: Payment History Using Delta Extraction
Other terms
BWOM2_TIMEST, TIMESTAMP, BWOM_SETTINGS, LAST_TS, delta, init
Reason and Prerequisites
For test or correction purposes you require a function to repeat delta extractions.
In the standard system, you can repeat the delta extractions for the last sixty days.
The number of days is defined by the 'DELTIMEST' parameter in the BWOM_SETTINGS table.
For all DataSources mentioned above you can repeat the delta extraction without problems because all DataSources use an ODS object. The system automatically filters out data that is selected several times.
Solution
The solution consists of resetting the time stamps used for the delta extraction in the BWOM2_TIMEST table to a certain date.
To do this, you must reset the 'X' marker in the 'LAST_TS' column for the last successful delta extraction.
The next delta extraction then loads all data created or changed as of this date to the BW system.
To reset the 'X' marker in the 'LAST_TS' column, use the 'BWOM2_TIMEST' report. If the report is not yet available in your system, you can install it as described in Note 836288.
Caution - important note
If you repeat several delta extractions in a run using this note, the number of the extracted records will in most cases be smaller than the total of the individual delta extractions.
This can happen in the following cases:
If documents are contained in several individual delta extractions due to document changes. The system selects these documents only once.
If you have set parameter BWFIOVERLA = X in the BWOM_SETTINGS table. (See Note 485958)
In this case, the upper limit of the period for the data selection is not, as usual, 23:59:59 of the previous day. The system also extracts all existing documents of the current day. These documents are also contained in the following delta extraction.
1. If you can load the expected data volume in one delta extraction, proceed as follows:
Execute the 'BWOM2_TIMEST' report for the relevant DataSource.
Remove the marker in the entry of the last delta extraction.
In the columns for the date of the time stamp and the time of the time stamp, you can see up to when the individual data extractions have extracted data.
Select the entry up to which the data in your BW system is still correct.
Save the changes.
Do not make any further changes in the BWOM2_TIMEST table.
The next delta extraction uses the date + 1 of the selected entry as a lower limit for the data selection and the current date - 1 for the upper limit.
2. If you must load the expected data volume in several parts, proceed as follows:
Caution: There are the following two cases:
a) The 'BWFIOVERLA' parameter in the BWOM_SETTINGS table is not set.
Execute the 'BWOM2_TIMEST' report for the relevant DataSource.
Remove the marker in the entry of the last delta extraction.
In the columns for the date of the time stamp and the time of the time stamp, you can see up to when the individual data extractions have extracted data.
Select the entry up to which the data in your BW system is still correct.
Save the changes.
Do not make any further changes in the BWOM2_TIMEST table.
The next delta extraction uses the date + 1 of the selected entry as a lower limit for the data selection.
You can control the upper limit for the data selection by changing the 'BWFISAFETY' parameter in the BWOM_SETTINGS table. The upper limit is determined using the current date minus the value of the 'BWFISAFETY' parameter (default = 1).
If you want to extract the data of the last twenty days in four parts, for example, set the parameter as follows in Transaction SE16:
before the 1st delta extraction: BWFISAFETY = 15
before the 2nd delta extraction: BWFISAFETY = 10
before the 3rd delta extraction: BWFISAFETY = 5
before the 4th delta extraction: BWFISAFETY = 1 (= default)
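The selection-window arithmetic described above can be sketched as follows (Python purely for illustration; the dates are examples, and the logic follows the note's description of the lower and upper limits):

```python
from datetime import date, timedelta

def selection_window(last_good_date, today, bwfisafety):
    """Lower limit = day after the last correct timestamp entry;
    upper limit = current date minus the BWFISAFETY parameter."""
    lower = last_good_date + timedelta(days=1)
    upper = today - timedelta(days=bwfisafety)
    return lower, upper

# Re-extracting the last 20 days in four parts, as in the example:
today = date(2009, 12, 10)
last_good = today - timedelta(days=21)
windows = []
for safety in (15, 10, 5, 1):
    lower, upper = selection_window(last_good, today, safety)
    windows.append((lower, upper))
    last_good = upper  # each successful run advances the timestamp
```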
b) The 'BWFIOVERLA' parameter in the BWOM_SETTINGS table is set (=X).
First, deactivate the 'BWFIOVERLA' parameter in the BWOM_SETTINGS table. This is necessary because if you do not do this, the system ignores the settings of the 'BWFISAFETY' parameter.
Execute the 'BWOM2_TIMEST' report for the relevant DataSource.
Remove the marker in the entry of the last delta extraction.
In the columns for the date of the time stamp and the time of the time stamp you can see up to when the individual data extractions have extracted data.
Do NOT select the entry up to which the data in your BW system is still correct, rather select the entry before that.
Save the changes.
Do not make any further changes in the BWOM2_TIMEST table.
The next delta extraction uses the date + 1 of the selected entry as a lower limit for the data selection.
You can control the upper limit for the data selection by changing the 'BWFISAFETY' parameter in the BWOM_SETTINGS table. The upper limit is determined using the current date minus the value of the 'BWFISAFETY' parameter (default = 1).
If for example you want to extract the data of the last twenty days in four parts, set the parameter in Transaction SE16 as follows:
before the 1st delta extraction: BWFISAFETY = 15
before the 2nd delta extraction: BWFISAFETY = 10
before the 3rd delta extraction: BWFISAFETY = 5
before the 4th delta extraction: BWFISAFETY = 1 (= default)
and set parameter BWFIOVERLA = X again to return to the original delta mode. -
Performance Tunning- data extraction from FMGLFLEXA table
Hi,
Need to fetch data from the FMGLFLEXA table based on the condition below:
WHERE rfund IN s_rfund
AND rgrant_nbr IN s_rgnbr
AND rbusa IN s_rbusa
AND budat LE v_hbudat.
Please tell me how I can optimize this extraction, because in the production system there are lakhs of records.
Regards,
Shweta.

Create an index on these fields; data extraction from the table will then be fast.
-
HR tables structure for BW data extraction
Hi Experts,
How are the HR table structures different from other SAP tables from a BW data extraction perspective?
Appreciate your help in this regard.
Thanks,
Varun.

The first and biggest difference is the time dependency of HR infotypes: when you load master data, it is to a great extent time dependent!
Message was edited by: Claudio Caforio -
Extracting Multiple Table Data Dynamically..Table is an Input parameter
Hi All
Can anyone share a program/design for extracting multiple tables' data as a list or to an Excel sheet?
For example: multiple tables are entered on the selection screen, and upon executing, the output should go to an Excel sheet sequentially according to the table names entered on the selection screen.
Awaiting your update.
Regards
Shiva
Edited by: shivakumar bandari on May 29, 2009 9:35 PM

Hi Naimes,
Thanks for your reply.
Your second step says to select data from 'table', but here the tables are dynamic in nature, I mean they are input parameters. How can I write a select on such a table?
I can get the table existence from the DD02L table and pass the retrieved tables to FM GET_COMPONENT_LIST to get the fields of each table,
but I cannot pass the dynamic table value to a select query to retrieve the data. Please update me if you have anything on retrieving data from a dynamically supplied table name.
Can I do something like this, with a dynamic table name?
SELECT * FROM (lv_tabname) INTO TABLE <dyn_table>.
Any suggestions will be appreciated
thank you -
Data Extraction from AR Aging Tables to Acess
Hi
I used to work on developing reports, but I am new to extracting data from the AR Aging tables into Access, from where the data is uploaded to SAP. After mapping, the data is loaded to SAP.
Can anybody help me resolve this issue? I would really appreciate it.
Hi
I have table data that I want to read and put into another desired format.
Here is the actual data:
sno Description Qty. Units Rate (Rs.) Value (Rs.)
1 Aviation Lamp 1 Nos 3700 3,700
2 Lighting Arrester 1 Nos 1600 1,600
3 Aviataion Lamp Cable 31 Rmt 270 8,370
Gross Total 13,670
The above Total include Services Tax Amount of Rs 536
Total Invoice Value I 13,670
Now Claimed 100% of Invoice Value 13,670
(Rupees Thirteen Thousand Six Hundred and Seventy only)
My desired output should be:
S.No 1
Description Aviation Lamp
Qty. 1
Units Nos
Rate (Rs.) 3700
Value (Rs.) 3700
S.No 2
Description Lighting Arrester
Qty. 1
Units Nos
Rate (Rs.) 1600
Value (Rs.) 1600
S.No 3
Description Aviataion Lamp Cable
Qty. 31
Units Rmt
Rate (Rs.) 270
Value (Rs.) 8,370
Gross Total 13,670
Tax Amount of Rs 536
Total Invoice Value 13,670
I have written code like this:
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
public class TabDeli {
    public static void main(String[] args) throws IOException {
        String[] labels = {"S.No", "Description", "Qty.", "Units", "Rate (Rs.)", "Value (Rs.)"};
        StringBuilder output = new StringBuilder();
        try (BufferedReader in = new BufferedReader(new FileReader("D:\\tab2.txt"))) {
            String line;
            int lineNo = 0;
            while ((line = in.readLine()) != null) {
                lineNo++;
                if (lineNo == 1) continue;            // skip the header line
                String[] fields = line.split("\\$");  // '$' is the field delimiter
                if (fields.length == labels.length) {
                    // detail line: print one "label value" pair per output line
                    for (int i = 0; i < fields.length; i++) {
                        output.append(labels[i]).append(' ').append(fields[i].trim()).append('\n');
                    }
                    output.append('\n');
                } else if (fields.length == 2) {
                    // summary line such as "Gross Total $13,670$"
                    output.append(fields[0].trim()).append(' ').append(fields[1].trim()).append('\n');
                } else {
                    System.out.println("Error in line number: " + lineNo);
                }
            }
        }
        System.out.println(output);
    }
}
I think a simpler approach is to add a delimiter, then cut the substring before each delimiter. You can also check whether a part of the line is missing by counting the delimiters. Here is my suggestion:
S.No Description Qty. Units Rate (Rs.) Value (Rs.)
1$Aviation Lamp$1$Nos$3700$3,700$
2$Lighting Arrester$1$Nos$1600$1,600$
3$Aviataion Lamp Cable$31$Rmt$270$8,370$
Gross Total $13,670$
The above Total include Services Tax Amount of Rs $536$
Total Invoice Value $13,670$
That is how it should look. You can change the '$' sign if you like.
Then, extract all the content of the file into a single String.
I am unable to process it... can anyone give me an idea where I went wrong?

Really ambitious multi-poster:
[http://forum.java.sun.com/thread.jspa?threadID=5288046]
[http://forum.java.sun.com/thread.jspa?threadID=5295109]
[http://www.java-forums.org/new-java/8338-how-achieve.html]
[http://regexadvice.com/forums/thread/41546.aspx] -
Goldengate Extracts reads slow during Table Data Archiving and Index Rebuilding Operations.
We have configured OGG on a near-DR server. The extracts are configured to work in ALO Mode.
During the day, extracts work as expected and are in sync. But during any daily maintenance task, the extracts start lagging and read the same archives very slowly.
This usually happens during Table Data Archiving (DELETE from prod tables, INSERT into history tables) and during Index Rebuilding on those tables.
Points to be noted:
1) The Tables on which Archiving is done and whose Indexes are rebuilt are not captured by GoldenGate Extract.
2) The extracts are configured to capture DML operations. Only INSERT and UPDATE operations are captured; DELETEs are ignored by the extracts. DDL extraction is also not configured.
3) There is no connection to PROD or DR Database
4) System functions normally all the time, but just during table data archiving and index rebuild it starts lagging.
Q 1. As mentioned above, even though the tables are not part of the capture, the extract lags. What are the possible reasons for the lag?
Q 2. I understand that an index rebuild is a DDL operation, yet it induces a lag into the system. How?
Q 3. We have been trying to find a way to overcome the lag, which ideally shouldn't have arisen. Is there any extract parameter or some workaround for this situation?

Hi Nick.W,
The amount of redo logs generated is huge. Approximately 200-250 GB in 45-60 minutes.
I agree that the extract has to parse the extra object IDs. During the day, there is a redo switch every 2-3 minutes. The source is a 3-node RAC, so approximately 80-90 archives are generated in an hour.
The reason to mention this was that while reading these archives the extract would also be parsing extra object IDs, as we are capturing data for only 3 tables. The effect of parsing extra object IDs should have been seen during the day as well, since the archive size is the same, the amount of data is the same, and the number of records to be scanned is the same.
The extract slows down and read at half the speed. If normally it would take 45-50 secs to read an archive log of normal day functioning, then it would take approx 90-100 secs to read the archives of the mentioned activities.
Regarding the 3rd point,
a. The extract is a classic extract, the archived logs are on local file system. No ASM, NO SAN/NAS.
b. We have added "TRANLOGOPTIONS BUFSIZE" parameter in our extract. We'll update as soon as we see any kind of improvements. -
BODS 3.1 : SAP R/3 data extraction -What is the difference in 2 dataflows?
Hi.
Can anyone advise as to what is the difference in using the data extraction flow for extracting Data from SAP R/3 ?
1)DF1 >> SAPR/3 (R3/Table -query transformation-dat file) >>query transformation >> target
This ABAP flow generates a ABAP program and a dat file.
We can also upload this program and run jobs as execute preloaded option on datastore.
This works fine.
2) We also can pull the SAP R/3 table directly.
DF2>>SAPR/3 table (this has a red arrow like in OHD) >> Query transformation >> target
THIS ALSO Works fine. And we are able to see the data directly into oracle.
Which can also be scheduled on a job.
BUT am unable to understand the purpose of using the different types of data extraction flows.
When to use which type of flow for data extraction.
Advantage / disadvantage - over the 2 data flows.
What we are not understanding is that :
if we can directly pull data from R/3 table directly thro a query transformation into the target table,
why use the Flow of creating a R/3 data flow,
and then do a query transformation again
and then populate the target database?
There might be some practical reasons for using these 2 different types of flows in doing the data extraction. Which I would like to understand. Can anyone advise please.
Many thanks
indu
Edited by: Indumathy Narayanan on Aug 22, 2011 3:25 PM

Hi Jeff.
Greetings. And many thanks for your response.
Generally we pull the entire SAP R/3 table thro query transformation into oracle.
For which we use R/3 data flow and the ABAP program, which we upload on the R/3 system
so as to be able to use the option of Execute preloaded - and run the jobs.
Since we do not have any control on our R/3 servers nor we have anyone on ABAP programming,
we do not do anything at the SAP R/3 level
I was doing trial-and-error testing on our workflows for our new requirement:
WF 1, which has some 15 R/3 tables.
For each table we have created a separate Dataflow.
And finally in between in some dataflows, wherein, the SAP tables which had lot of rows, i decided to pull it directly,
by-passing the ABAP flow.
And still the entire work flow and data extraction happens ok.
In fact i tried creating a new sample data flow and tested.
Using direct download and - and also execute preloaded.
I did not see any major difference in time taken for data extraction;
Because anyhow we pull the entire Table, then choose whatever we want to bring into oracle thro a view for our BO reporting or aggregate and then bring data as a table for Universe consumption.
Actually, I was looking at other options to avoid this ABAP generation and the R/3 data flow, because we are having problems on our dev and QA environments, which give delimiter errors, whereas in production it works fine. The production environment is an old setup of BODS 3.1; QA and Dev are relatively new BODS environments, and those are the ones with the delimiter error.
I did not understand how to resolve it as per this post: https://cw.sdn.sap.com/cw/ideas/2596
And trying to resolve this problem, I ended up with the option of trying to pull directly the R/3 table. Without using ABAP workflow. Just by trial and error of each and every drag and drop option. Because we had to urgently do a POC and deliver the data for the entire e recruiting module of SAP.
I don't know whether I can do this direct pulling of data for the new job which I have created,
which has 2 workflows with 15 dataflows in each workflow,
And and push this job into production.
And also whether i could by-pass this ABAP flow and do a direct pulling of R/3 data, in all the Dataflows in the future for ANY of our SAP R/3 data extraction requirement. And this technical understanding is not clear to us as regards the difference between the 2 flows. And being new to this whole of ETL - I just wanted to know the pros and cons of this particular data extraction.
As advised I shall check the schedules for a week, and then we shall move it probably into production.
Thanks again.
Kind Regards
Indu
Edited by: Indumathy Narayanan on Aug 22, 2011 7:02 PM -
HI All,
I have gone through the SDN link for FI extraction and found it very useful.
Still i have some doubts....
For line item data extraction, do we need to extract data from 0FI_GL_4, 0FI_AP_4 and 0FI_AR_4 into ODS 0FIGL_O02 (General Ledger: Line Items)? If so, do we need to maintain a transformation between the ODS and all three DataSources?
Also, please educate me on the 0FI_AP_3 and 0FI_AP_4 DataSources.
Jacob Jansen wrote:
> Hi Raj.
>
> Yes, you should run GL_4 first. If not, AP_4 and AR_4 will be lagging behind. You can see in R/3 in table BWOM_TIMEST how the deltas are "behaving" with respect to the date selection.
>
> br
> jacob
Not necessarily for systems above plug in 2002.
As of Plug-In 2002.2, it is no longer necessary to have DataSources linked. This means that you can now load 0FI_GL_4, 0FI_AR_4, 0FI_AP_4, and 0FI_TX_4 in any order. You also have the option of using DataSources 0FI_AR_4, 0FI_AP_4 and 0FI_TX_4 separately without 0FI_GL_4. The DataSources can then be used independently of one another (see note 551044).
Source:http://help.sap.com/saphelp_nw04s/helpdata/EN/af/16533bbb15b762e10000000a114084/frameset.htm -
Creating a zipped file from table data
I am currently extracting data from a table and creating a flat file which is read later to create a compressed (zip) file.
Is there a way that I can create a zipped file in one step (i.e. directly from the table data)? I am trying to reduce the number of file i/o hits in my appication.
Thanks,
Minesh

Is there a way that I can create a zipped file in one step (i.e. directly from the table data)? Yes. Instead of doing something like:
FileOutputStream fout = new FileOutputStream("flatfile");
// write using fout
do this:
ZipOutputStream zout = new ZipOutputStream(new FileOutputStream("flatfile.zip"));
zout.putNextEntry(new ZipEntry("flatfile"));
// write using zout
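The same one-step idea, sketched in Python with the standard zipfile module (file and field names are illustrative): the flat-file content is built in memory and written straight into the archive, with no intermediate file on disk.

```python
import io
import zipfile

# Rows "extracted from the table" (sample data for illustration)
rows = [("100101", 100.00), ("100102", 10.00)]

# Build the flat-file content in memory ...
flat = "\n".join(f"{matnr},{price}" for matnr, price in rows)

# ... and write it directly into a zip archive, in one step
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
    zf.writestr("flatfile", flat)

archive_bytes = buf.getvalue()
```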