Regarding loading data
Hi All,
As part of the first phase we have done LO extraction for all standard cubes. Now we have created custom cubes which get data from standard DataSources like 2lis_02_itm, 2lis_02_hdr, etc. As the deltas have already been extracted by job controls, how do I proceed so that I can do Full, Init and Delta loads for the new InfoCubes?
Please suggest me in this regard.
Thanks
Jay
Suppose a cube already had a first init and has been loading deltas every day for a long time.
Now if you do a full load to the same cube, it will receive not only the init data but also the data from all the deltas loaded in the past.
Basically, when you do an init, all records from the source system table are sent to BW and the delta is initialized. For the delta load the next day, whatever records get added or changed in the source table get pushed to the delta queue.
Now if you do a full load a year later, the system does not need to look at the delta queue; the full load simply pushes all records of the source table to BW.
Do a full load to the new InfoCubes only (not to the standard ones, which are already getting deltas); the new InfoCubes will then receive all the data.
Then remove the init from the InfoPackage and do an 'init without data transfer' to all data targets (old and new), so that the delta loads can go on as normal.
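The init / delta / full behaviour described above can be pictured with a toy model (plain Python, purely illustrative; the table, queue and record values are not real SAP structures):

```python
# Toy model of the init / delta / full load behaviour described above.
source_table = [1, 2, 3]   # records currently in the source system table
delta_queue = []           # changed records waiting for the next delta load

def init_load():
    """Init: send everything currently in the source table to BW."""
    return list(source_table)

def post_record(rec):
    """A new or changed record lands in the table and the delta queue."""
    source_table.append(rec)
    delta_queue.append(rec)

def delta_load():
    """Delta: drain only the queued changes."""
    sent = list(delta_queue)
    delta_queue.clear()
    return sent

def full_load():
    """Full: ignore the delta queue and resend the whole source table."""
    return list(source_table)

cube = init_load()         # first init
post_record(4)             # a posting the next day
cube += delta_load()       # the daily delta picks it up
# A full load later repeats everything: init data plus all past deltas.
print(full_load())         # [1, 2, 3, 4]
```

This is why a full load to an already-delta-fed cube duplicates data, while a full load to a brand-new cube is safe.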
cheers,
Vishvesh
Similar Messages
-
Regarding: Loading data from R/3 To BI for a Generic Data source
Hi Everyone,
Need help urgently.
I created a generic DataSource with a function module as the extractor; in RSA3 it is working fine.
1-> I replicated the DataSource to BI, then created an InfoPackage and executed it. It gets the data, and the number of records shows in the request monitor, but the status does not change from yellow to green.
The status in the step-by-step analysis is green for every step except "Data selection successfully finished?" (RED).
2-> Then I looked at the background job in the source system, which was still executing. I waited for it for a long time, nearly 30 minutes.
(I repeated steps 1 and 2 a number of times by activating and replicating the DataSource again, and so on, but there is still no change.)
3-> Then I cancelled that background job with the help of Basis (as I felt something was going wrong).
4-> I feel there is something wrong in the code of the extractor.
Please help.
*" Local Interface:
*" IMPORTING
*" VALUE(I_REQUNR) TYPE SRSC_S_IF_SIMPLE-REQUNR
*" VALUE(I_DSOURCE) TYPE SRSC_S_IF_SIMPLE-DSOURCE OPTIONAL
*" VALUE(I_MAXSIZE) TYPE SRSC_S_IF_SIMPLE-MAXSIZE OPTIONAL
*" VALUE(I_INITFLAG) TYPE SRSC_S_IF_SIMPLE-INITFLAG OPTIONAL
*" VALUE(I_READ_ONLY) TYPE SRSC_S_IF_SIMPLE-READONLY OPTIONAL
*" VALUE(I_REMOTE_CALL) TYPE SBIWA_FLAG DEFAULT SBIWA_C_FLAG_OFF
*" TABLES
*" I_T_SELECT TYPE SRSC_S_IF_SIMPLE-T_SELECT OPTIONAL
*" I_T_FIELDS TYPE SRSC_S_IF_SIMPLE-T_FIELDS OPTIONAL
*" E_T_DATA STRUCTURE ZBI_MATGRIR OPTIONAL
*Need to get the data only for two G/L accounts which are for material purchases during MIGO
*G/L Account Numbers: 0010502001 0010502002
data: E_T_DATA1 type table of ZBI_MATGRIR.
RANGES: R_BUKRS FOR BSIS-BUKRS,
R_BUDAT FOR BSIS-BUDAT,
R_GJAHR FOR BSIS-GJAHR,
R_HKONT FOR BSIS-HKONT.
DATA: L_S_SELECT TYPE SRSC_S_SELECT.
STATICS: S_S_IF TYPE SRSC_S_IF_SIMPLE,
S_COUNTER_DATAPAKID LIKE SY-TABIX,
S_CURSOR TYPE CURSOR.
*Declare
TYPES: BEGIN OF TY_FAGL,
RBURS TYPE FAGLFLEXA-RBUKRS,
RYEAR TYPE FAGLFLEXA-RYEAR,
DOCNR TYPE FAGLFLEXA-DOCNR,
BUZEI TYPE FAGLFLEXA-BUZEI,
DOCLN TYPE FAGLFLEXA-DOCLN,
PRCTR TYPE FAGLFLEXA-PRCTR,
SEGMENT TYPE FAGLFLEXA-SEGMENT,
END OF TY_FAGL.
DATA: GT_FAGL TYPE TABLE OF TY_FAGL,
GS_FAGL TYPE TY_FAGL.
IF I_INITFLAG = SBIWA_C_FLAG_ON.
CASE I_DSOURCE.
WHEN 'ZFI_GL_M4'.
WHEN OTHERS.
IF 1 = 2. MESSAGE E009(R3). ENDIF.
* this is a typical log call. Please write every error message like this
* LOG_WRITE 'E'          "message type
*           'R3'         "message class
*           '009'        "message number
*           I_DSOURCE    "message variable 1
*           ' '.         "message variable 2
RAISE ERROR_PASSED_TO_MESS_HANDLER.
ENDCASE.
APPEND LINES OF I_T_SELECT TO S_S_IF-T_SELECT.
* Fill parameter buffer for data extraction calls
S_S_IF-REQUNR = I_REQUNR.
S_S_IF-DSOURCE = I_DSOURCE.
S_S_IF-MAXSIZE = I_MAXSIZE.
APPEND LINES OF I_T_FIELDS TO S_S_IF-T_FIELDS.
ELSE.
* Data transfer: First Call OPEN CURSOR + FETCH
*                Following Calls FETCH only
* First data package -> OPEN CURSOR
IF S_COUNTER_DATAPAKID = 0.
* Fill range tables. BW will only pass down simple selection criteria
* of the type SIGN = 'I' and OPTION = 'EQ' or OPTION = 'BT'.
LOOP AT S_S_IF-T_SELECT INTO L_S_SELECT WHERE FIELDNM = 'BUKRS'.
MOVE-CORRESPONDING L_S_SELECT TO R_BUKRS.
APPEND R_BUKRS.
ENDLOOP.
LOOP AT S_S_IF-T_SELECT INTO L_S_SELECT WHERE FIELDNM = 'GJAHR'.
MOVE-CORRESPONDING L_S_SELECT TO R_GJAHR.
APPEND R_GJAHR.
ENDLOOP.
LOOP AT S_S_IF-T_SELECT INTO L_S_SELECT WHERE FIELDNM = 'BUDAT'.
MOVE-CORRESPONDING L_S_SELECT TO R_BUDAT.
APPEND R_BUDAT.
ENDLOOP.
*GRIR Inventory (RM/Stores/Spares/FG) 10502001
*GRIR Services & Others Payable 10502002
R_HKONT-SIGN = 'I'. "i_t_select-sign.
R_HKONT-OPTION = 'BT'." i_t_select-option.
R_HKONT-LOW = '0010502001'.
R_HKONT-HIGH = '0010502002'. "i_t_select-high.
APPEND R_HKONT.
* Determine number of database records to be read per FETCH statement
* from input parameter I_MAXSIZE. If there is a one to one relation
* between DataSource table lines and database entries, this is trivial.
* In other cases, it may be impossible and some estimated value has to
* be determined.
OPEN CURSOR WITH HOLD S_CURSOR FOR
SELECT BUKRS
AUGBL
ZUONR
BELNR
GJAHR
BUZEI
BUDAT
HKONT
BLART
MONAT
BSCHL
SHKZG
DMBTR
WAERS
FROM BSIS
INTO TABLE E_T_DATA
WHERE BUKRS IN R_BUKRS
AND GJAHR IN R_GJAHR
AND BUDAT IN R_BUDAT
AND HKONT IN R_HKONT.
* Fetch records into interface table,
* named E_T_'Name of extract structure'.
FETCH NEXT CURSOR S_CURSOR
APPENDING CORRESPONDING FIELDS
OF TABLE E_T_DATA1
PACKAGE SIZE S_S_IF-MAXSIZE.
IF SY-SUBRC <> 0.
CLOSE CURSOR S_CURSOR.
RAISE NO_MORE_DATA.
ENDIF.
DELETE E_T_DATA WHERE BLART NE 'WE'.
SELECT BUKRS
AUGBL
ZUONR
BELNR
GJAHR
BUZEI
BUDAT
HKONT
BLART
MONAT
BSCHL
SHKZG
DMBTR
WAERS
FROM BSAS
into table E_T_DATA
WHERE BUKRS IN R_BUKRS
AND GJAHR IN R_GJAHR
AND BUDAT IN R_BUDAT
AND HKONT IN R_HKONT.
FETCH NEXT CURSOR S_CURSOR
APPENDING CORRESPONDING FIELDS
OF TABLE E_T_DATA
PACKAGE SIZE S_S_IF-MAXSIZE.
append LINES OF e_t_data1 TO E_T_DATA.
DELETE E_T_DATA WHERE BLART NE 'WE'.
ENDIF. "First data package ?
DATA: F_YEAR TYPE BKPF-GJAHR.
DATA: F_PERI TYPE BAPI0002_4-FISCAL_PERIOD.
IF E_T_DATA[] IS NOT INITIAL.
SELECT RBUKRS
RYEAR
DOCNR
BUZEI
DOCLN
PRCTR
SEGMENT
FROM FAGLFLEXA
INTO TABLE GT_FAGL
FOR ALL ENTRIES IN E_T_DATA
WHERE RYEAR = E_T_DATA-GJAHR
AND DOCNR = E_T_DATA-BELNR
AND RLDNR = '0L'
AND RBUKRS = E_T_DATA-BUKRS
AND BUZEI = E_T_DATA-BUZEI.
* Leftover alternative WHERE clause, commented out to keep the SELECT valid:
* WHERE RYEAR = E_T_DATA-GJAHR
*   AND DOCNR = E_T_DATA-BELNR
*   AND RBUKRS = E_T_DATA-BUKRS
*   AND DOCLN = E_T_DATA-BUZEI.
ENDIF.
LOOP AT E_T_DATA.
IF E_T_DATA-SHKZG = 'H'.
E_T_DATA-DMBTR = E_T_DATA-DMBTR * -1.
ENDIF.
CLEAR: F_YEAR.
CALL FUNCTION 'BAPI_COMPANYCODE_GET_PERIOD'
EXPORTING
COMPANYCODEID = E_T_DATA-BUKRS
POSTING_DATE = E_T_DATA-BUDAT
IMPORTING
FISCAL_YEAR = F_YEAR
FISCAL_PERIOD = F_PERI.
DATA: V_DOC(6) TYPE C .
CLEAR: V_DOC.
V_DOC = E_T_DATA-BUZEI.
IF V_DOC IS NOT INITIAL.
CALL FUNCTION 'CONVERSION_EXIT_ALPHA_INPUT'
EXPORTING
INPUT = V_DOC
IMPORTING
OUTPUT = V_DOC.
ENDIF.
* As profit center is not updated in all the lines in BSIS
READ TABLE GT_FAGL INTO GS_FAGL WITH KEY RYEAR = E_T_DATA-GJAHR
DOCNR = E_T_DATA-BELNR
RBURS = E_T_DATA-BUKRS
BUZEI = E_T_DATA-BUZEI.
IF SY-SUBRC = 0.
E_T_DATA-PRCTR = GS_FAGL-PRCTR.
E_T_DATA-SEGMENT = GS_FAGL-SEGMENT.
ENDIF.
*As we are using the amount DMBTR, the amount will be in company code
*currency, i.e. local currency; group currency is always in the main
*company code currency.
CONCATENATE F_YEAR '0' F_PERI INTO E_T_DATA-FISCPER.
MODIFY E_T_DATA. " from gs_bsis transporting dmbtr fiscper.
CLEAR: E_T_DATA.
ENDLOOP.
S_COUNTER_DATAPAKID = S_COUNTER_DATAPAKID + 1.
ENDIF.
Hi,
Please check the log of the same job for last week and see whether it is taking more time today; also check with Basis whether any backup was initiated at the same time.
Moreover, until the background job fails on its own, it is difficult to say what the exact issue is.
Thanks, -
Regarding Load data from one cube to other cube in BW 3.X
Hello Experts,
Here is my scenario. I have one cube, say C1, from which I need to transfer data into a cube, say C2, for records which have material code <> blank and sales quantity = 0. We are using BW 3.x.
Now how do I do that?
Kindly help urgently.
You have to write the below code in the start routine of your update rule.
DELETE DATA_PACKAGE WHERE MATERIAL IS INITIAL OR SALES_QUANTITY NE 0.
This code deletes the records where the material is blank or the sales quantity is not equal to zero.
So at the end of the start routine, you will only have records with a non-blank material and a sales quantity equal to 0.
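For illustration only, here is the same keep/delete rule sketched in Python (field names are hypothetical, not the actual InfoObject names):

```python
# Keep only records with a non-blank material and a sales quantity of zero,
# i.e. delete where material is blank OR the quantity is not zero.
data_package = [
    {"material": "M1", "sales_qty": 0},   # kept
    {"material": "",   "sales_qty": 0},   # dropped: blank material
    {"material": "M2", "sales_qty": 5},   # dropped: non-zero quantity
]

data_package = [r for r in data_package
                if r["material"] and r["sales_qty"] == 0]
print(data_package)   # [{'material': 'M1', 'sales_qty': 0}]
```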
--- Thanks... -
Loading data from flat file...
Hello,
I am experiencing a problem where I can't load data from a flat file into a table. I have reverse engineered the flat file into ODI, but I don't know how to load the reversed data into an RDBMS table. Also, I don't know how to create this RDBMS table from within ODI so that it is reflected in the DB, or how to load the data from the flat file into this table without having to add the columns manually in order to map between the flat file and the table.
In conclusion, I need to know how to create an RDBMS table from within ODI on the database, and how to automatically map the flat file to the DB table and load the data into it.
Regards,
Hossam
Hi Hossam,
We can use an ODI procedure to create the table in the DB.
Make sure you keep the column names in the table the same as the column names in the flat file so that the columns can be mapped automatically.
Regarding loading data from a flat file (i.e. when our source is a flat file): up to ODI 10.1.3.4 we need to insert the datastore manually, since the file system cannot be reverse engineered.
Please let me know, Hossam, if I can assist you further.
Thanks and Regards,
Andy -
I am working on our very first Essbase database, and we are currently loading data for the previous month. Our measures dimension has around 18 children right now. When we loaded the data, we noticed that the numbers were a bit off, even though the load was 'successful'. When we tried to load the June data one measure at a time, the numbers for some measures were corrected; still, some of the measures were off. Does anyone know of any limitations regarding loading data through SQL? Thank you for your help!
'Successful' only means that the SQL source was accessed without generating any SQL errors. You may still have had data load errors (rows which Essbase was unable to load into the database). Check the data load error file (default location in the %Arborpath%\client directory).
David
-
Regarding MM Data Loads.
Hi All,
I am loading data for the MM cubes, but found that many DataSources like 2LIS_02_CGR, 2LIS_02_SCN and 2LIS_02_SGR are blank and do not have any data. I suspect I forgot to do something in R/3 to get the data. I have populated the setup tables as well.
Is there any sequence for loading data through these DataSources? For a few DataSources like 2LIS_02_SCL, the data transferred is in lakhs of records, but 0 records were added.
Please help me. I will assign points.
Thanks and Regards,
Sangini.
Hi Sangini,
Refer to following link
http://help.sap.com/saphelp_nw04/helpdata/en/8d/bc383fe58d5900e10000000a114084/frameset.htm
It refers to SAP std sequence of loading data through process chain .
Hope it helps.
Regards
Mr Kapadia -
Regarding master data loading for different source systems
Hi Friends,
I have an issue regarding master data loading.
We have two source systems: one is 4.6C and the other is ECC 6.0.
First I am loading the master data from 4.6C to BI 7.0.
Now this 4.6C is upgraded to ECC 6.0.
In both 4.6C and ECC 6.0 the master data is changing.
After some time there will be no 4.6C; only ECC 6.0 will be there.
Now if I load master data from ECC 6.0 to BI 7.0, what will happen?
Is it possible ?
Could you please tell me?
Regards,
ramnaresh.
Hi ramnaresh porana,
Yes, it's possible. You can load data from ECC.
The data will not change. You may get more fields in the DataSource on the R/3 side, but on the BW/BI side there is no change: the mappings and structures are the same, so the data is the same too.
You need to take care of the deltas before and after the upgrade.
Hope it Helps
Srini -
Regarding Loading the data from DSO to cube.
Hello Experts,
I have a DSO which loads data from the PSA using a 7.0 transformation (via a DTP), and a cube which loads data from that DSO using 3.x transfer rules. Now I have deleted the init request for an InfoPackage before loading the data into the DSO. But when I load the data from the DSO to the cube by right-clicking the DSO -> Additional Functions -> Update 3.x data to targets, I get an error like 'Delete init. request REQU_4H7UY4ZXAO72WR4GTUIW0XUKP before running init. again with same selection'.
Please help me with this.
I want to load the data in the init request to the cube.
Thanks
Hi Shanthi,
Thanks for the reply. I have already deleted the init request from the source system to the DSO and then tried again, but I am still getting the error.
Thanks -
Regarding Short Dump While loading data from DB Connect
Dear All,
We are getting a short dump while loading data from DB Connect to BW. We were able to load the data into BW Dev using the same DataSource without any problem, whereas in Production I am getting the following error:
Runtime Error PERFORM_CONFLICT_TAB_TYPE
Except. CX_SY_DYN_CALL_ILLEGAL_TYPE
What could be the reason for the error that I am getting?
Hi,
Refer Note 707986 - Writing in trans. InfoCubes: PERFORM_CONFLICT_TAB_TYPE
Summary
Symptom
When data is written to a transactional InfoCube, the termination PERFORM_CONFLICT_TAB_TYPE occurs. The short dump lists the following reasons for the termination:
("X") The row types of the two tables are incompatible.
("X") The table keys of the two tables do not correspond.
Other terms
transactional InfoCube, SEM, BPS, BPS0, APO
Reason and Prerequisites
The error is caused by an intensified type check in the ABAP runtime environment.
Solution
Workaround for BW 3.0B (SP16-19), BW 3.1 (SP10-13)
Apply the attached correction instructions.
BW 3.0B
Import Support Package 20 for 3.0B (BW3.0B Patch20 or SAPKW30B20) into your BW system. The Support Package is available once note 0647752 with the short text "SAPBWNews BW3.0B Support Package 20", which describes this Support Package in more detail, has been released for customers.
BW 3.10 Content
Import Support Package 14 for 3.10 (BW3.10 Patch14 or SAPKW31014) into your BW system. The Support Package is available once note 0601051 with the short text "SAPBWNews BW 3.1 Content Support Package 14" has been released for customers.
BW3.50
Import Support Package 03 for 3.5 (BW3.50 Patch03 or SAPKW35003) into your BW system. The Support Package is available once note 0693363 with the short text "SAPBWNews BW 3.5 Support Package 03", which describes this Support Package in more detail, has been released for customers.
The notes specified may already be available to provide advance information before the Support Package is released. In this case, however, the short text still contains the term "Preliminary version".
Header Data
Release Status: Released for Customer
Released on: 18.02.2004 08:11:39
Priority: Correction with medium priority
Category: Program error
Primary Component: BW-BEX-OT-DBIF Interface to Database
Secondary Components: FIN-SEM-BPS Business Planning and Simulation
Releases
Software Component | Release From | Release To | And Subsequent
SAP_BW | 30 | 30B | 30B
SAP_BW | 310 | 310 | 310
SAP_BW | 35 | 350 | 350
Support Packages
Software Component | Release | Package Name
SAP_BW_VIRTUAL_COMP | 30B | SAPK-30B20INVCBWTECH
Related Notes
693363 - SAPBWNews BW SP03 NW'04 Stack 03 RIN
647752 - SAPBWNews BW 3.0B Support Package 20
601051 - SAPBWNews BW 3.1 Content Support Package 14
Corrections Instructions
Correction Instruction | Valid From | Valid To | Software Component | Ref. Correction | Last Modification
301776 | 30B | 350 | SAP_BW | J19K013852 | 18.02.2004 08:03:33
Attributes
Attribute | Value
Further components (weitere Komponenten) | 0000031199
Thanks
(Activate ODS/Cube and Transfer rules again..) -
Regarding loading of excel data into oracle database
hello,
Can someone help me in knowing how we can load Excel sheet data into an Oracle database?
I will be really thankful to you.
Gursimran
Hi,
There is a tool provided by Oracle, the Oracle bulk loader (SQL*Loader), which you can use.
But in order to use it you need a control file, which specifies how the data should be loaded into the database, and a data file, which specifies what data should be loaded.
Example: this is the control file, which has the information on how the data is processed.
LOAD DATA
INFILE test.dat
INTO TABLE test
FIELDS TERMINATED BY '|'
(i, s)
Example: test.dat, which is the data file (say, your exported spreadsheet data), with fields separated by the '|' delimiter declared in the control file:
1|foo
2|bar
3|baz
Example of loading:
sqlldr <yourName> control=<ctlFile> log=<logFile> bad=<badFile>
sqlldr testing control=test.ctl log=test.log
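As a sanity check, the data file must use the same delimiter the control file declares. A small Python sketch (file name reused from the example above, written to a temp directory) that produces a pipe-delimited test.dat:

```python
import os
import tempfile

# Rows matching the (i, s) columns in the control file above.
rows = [(1, "foo"), (2, "bar"), (3, "baz")]

# Write test.dat using the same '|' delimiter the control file declares.
path = os.path.join(tempfile.gettempdir(), "test.dat")
with open(path, "w") as f:
    for i, s in rows:
        f.write(f"{i}|{s}\n")

print(open(path).read())
# 1|foo
# 2|bar
# 3|baz
```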
Thanks
Pavan Kumar N -
Regarding the Status of Loaded Data in Report.
Hi All,
My client wants to view the status of the data represented in the reports. For example, we have data till today in R/3, but we have only loaded data till yesterday; he wants that represented in the report, like 'Data as on May 8th'. Is there any script that can be written to get this information dynamically in the report header?
Hi Yj,
On the web, you can use the web item 'Text Elements' and choose element type 'General Text' and element ID ROLLUPTIME.
<object>
<param name="OWNER" value="SAP_BW">
<param name="CMD" value="GET_ITEM">
<param name="NAME" value="TextElements_1">
<param name="ITEM_CLASS" value="CL_RSR_WWW_ITEM_TEXT_ELEMENTS">
<param name="DATA_PROVIDER" value="Data_Provider">
<param name="GENERATE_CAPTION" value="">
<param name="GENERATE_LINKS" value="">
<param name="SHOW_FILTERS" value="">
<param name="SHOW_VARIABLES" value="">
<param name="ELEMENT_TYPE_1" value="COMMON">
<param name="ELEMENT_NAME_1" value="ROLLUPTIME">
<param name="ONLY_VALUES" value="X">
ITEM: TextElements_1
</object>
In BEx Analyzer, you can display the date via the menu Business Explorer -> Display Text Elements -> General; you will see some info such as 'author'... 'last refreshed'.
hope this helps. -
Regarding short dump while loading data
Hi All,
I am loading data using 0FI_GL_4. This is a periodic load which was working fine till March. A lot of postings happened in April and the period was open till the 20th. Now when I try to load the 12th period / April, I get the following error in a short dump: "ABAP/4 processor: DBIF_RSQL_SQL_ERROR".
This is the error analysis:
An exception occurred. This exception is dealt with in more detail below.
The exception, which is assigned to the class 'CX_SY_OPEN_SQL_DB', was neither
caught nor passed along using a RAISING clause, in the procedure
"BWFIR_READ_BSEG_CPUDT_DATA" "(FUNCTION)".
I have tried to load the data 5 times now and every time I get the same error. I checked with the Basis people about the DB space and it is fine.
Why am I suddenly getting this error?
Hi,
It looks like an internal table overflow in your source system.
Try reducing the data package size of your load, or ask your Basis guys to have a look at the short dump and perhaps increase the memory.
Try RSA3 in the source system with your data package size values from BW and see if it works. If not, reduce the number of records.
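The effect of a smaller data package size can be pictured with a simple generator (a sketch only, not the actual extractor API):

```python
def fetch_in_packages(records, package_size):
    """Yield records in packages of at most package_size rows,
    mimicking what a smaller data package size does to each FETCH."""
    for start in range(0, len(records), package_size):
        yield records[start:start + package_size]

rows = list(range(10))
packages = list(fetch_in_packages(rows, 4))
print([len(p) for p in packages])   # [4, 4, 2]
```

Each package is smaller, so each round trip needs less memory in the source system, at the cost of more round trips.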
On which plugin release is your system running?
hope this helps,
Oliviier. -
Regarding BW Data Load Errors .. plz help me BW Experts
hi bw experts,
When the loads come from R/3 to the BW system and a load fails, we get errors like:
error -1 - related to file opening
error -4 - related to data problems
Can you please tell me where all these error numbers and their complete descriptions are stored or can be found?
Will they reside in a table or any tcode?
Please help me.
vijay..
Hi,
What exactly is happening while loading data from R/3 to BW?
Can you be clearer about this?
Are you getting any short dump? Then check in tcode ST22.
Also check the job log monitor.
Let us know more details.
Reg
Pra -
Input ready query is not showing loaded data in the cube
Dear Experts,
In an input-ready query, we have the problem that it does not show values which were not entered through that query. Is there any setting in the input-ready query we can use to show the data loaded into the cube as well as the data entered through the input-ready query itself?
Thanks,
Gopi R
Hi,
Input-ready queries should always display the most recent data (i.e. all green requests and the yellow request). So check the status of the requests in the real-time InfoCube: there should be only green requests and at most one yellow request.
In addition you can try to delete the OLAP cache for the plan buffer query: Use RSRCACHE to do this. The technical names of the plan buffer query can be found as follows:
1. InfoCube\!!1InfoCube, e.g. ZTSC0T003/!!1ZTSC0T003 if ZTSC0T003 is the technical name of the InfoCube
2. MPRO\!!1MPRO, e.g. ZTSC0M002/!!1ZTSC0M002 if ZTSC0M002 is the technical name of the multiprovider
If the input ready query is defined on an aggregation level using a real-time InfoCube, the first case is relevant; if the aggregation level is defined on a multiprovider the second case is relevant. If the input-ready query is defined on a multiprovider containing aggregation levels again the first case is relevant (find the real-time InfoCubes used in the aggregation level).
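The naming pattern above can be captured in a one-line helper (illustrative only; it simply assumes the <name>/!!1<name> pattern described in this reply):

```python
def plan_buffer_query(provider_name):
    """Technical name of the plan buffer query for an InfoCube or
    MultiProvider, following the <name>/!!1<name> pattern above."""
    return f"{provider_name}/!!1{provider_name}"

print(plan_buffer_query("ZTSC0T003"))   # ZTSC0T003/!!1ZTSC0T003
print(plan_buffer_query("ZTSC0M002"))   # ZTSC0M002/!!1ZTSC0M002
```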
Regards,
Gregor -
Can we load data in chunks using data pump ?
We are loading data using a data pump, so I want to check my understanding.
Please correct me if I am wrong in my understanding:
ODI will fetch all data from the source (whether it is INIT or CDC) in one go and unload it into the staging area.
If that is true, will performance suffer with very large data volumes (50 million records at source), as ODI tries to load all the data in one go? I believe it would perform better if we loaded in chunks using the data pump.
Please confirm and correct.
Also I would like to know how can we configure chunk load using data-pump.
Thanks in Advance.
Regards,
Dinesh.
You may consider using LKM Oracle to Oracle (datapump):
http://docs.oracle.com/cd/E28280_01/integrate.1111/e12644/oracle_db.htm#r15c1-t2
In 11g, ODI reads from the source and writes to the target in parallel. This is the case where you specify a select query in the source command and an insert/update query in the target command. On the source side, ODI reads records from the source and adds them to a data queue; on the target side, a parallel thread reads data from the queue and writes to the target. So the overall performance will be bounded by the slower of the read and write processes.
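The parallel read/write described above can be sketched with a bounded queue and two threads (illustrative only, not ODI's actual implementation):

```python
import queue
import threading

SENTINEL = object()

def run_pipeline(source_rows, maxsize=100):
    """Reader thread fills a bounded queue; a writer thread drains it.
    Overall throughput is limited by the slower of the two sides."""
    q = queue.Queue(maxsize=maxsize)
    target = []

    def reader():
        for row in source_rows:
            q.put(row)          # blocks when the queue is full
        q.put(SENTINEL)

    def writer():
        while True:
            row = q.get()
            if row is SENTINEL:
                break
            target.append(row)  # stands in for the insert/update step

    threads = [threading.Thread(target=reader),
               threading.Thread(target=writer)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return target

print(len(run_pipeline(range(1000))))   # 1000
```

The bounded queue means neither side ever holds the full data set in memory, which is the point of the parallel design.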
Thanks,