Security issues when using the file adapter
I want to use the file adapter, and in the documentation it states that you can use "directories" and "ftp".
I want to transfer a file securely, and neither of these options is really secure.
Reading from a directory is not really secure when you read files from a Windows share (SMB protocol).
Is it possible to make a share more secure in a windows environment?
In an ftp session everybody can steal the password. Is there a possibility to use "sftp"? Or will this be available?
Currently we don't have it, but we will soon.
Similar Messages
-
Cannot process a Fixed Field Length file using the File Adapter (Sender)
Hi -
I have checked throughout these posts and blogs but I still have not been able to find a solution to my issue. When using the File Adapter (Sender) I get "Conversion initialization failed" with "xml.keyfieldName", no value found. Why would I require a key field when I am using fixed field lengths? The file is comprised of 2 structures: 1 header and multiple details (see below). There are no key fields in the flat file that I would be able to use. Any suggestions?
011000390 Customer Americas 20080605164317 000000000000000800000008000000000016000000
12345678 100500 100500 Supplier 1 0000000000030000002008040400
12345678 100501 100501 Supplier 2 0000000000052000002008042100
The File Adapter is configured as follows:
Document Name = Rfchke00
Document Namespace = 'my namespace'
Recordset Name = Rfchke00
Recordset Structure = Dtachkh,1,Dtachkp,*
Recordset Sequence = Ascending
Recordsets per Message = 1
Key Field Type = String (Case-Sensitive)
Dtachkh.fieldFixedLengths = 15,25,8,6,1,8,8,8,15,3,31
Dtachkh.fieldNames = F1,F2,F3,F4,F5,F6,F7,F8,F9,F10,F11
Dtachkh.processFieldNames = fromConfiguration
Dtachkp,fieldFixedLengths = 18,13,13,35,15,3,8,2,21
Dtachkp,fieldNames = F1,F2,F3,F4,F5,F6,F7,F8,F9
Dtachkp,processFieldNames = fromConfiguration
Thanks,
Dave
Hi,
You can use a module that converts your structure to
H011000390 Customer Americas 20080605164317 000000000000000800000008000000000016000000
D12345678 100500 100500 Supplier 1 0000000000030000002008040400
D12345678 100501 100501 Supplier 2 0000000000052000002008042100
Please note the extra H and D added to the structure by the module.
You can then use them as your key field values. The module should be deployed in Visual Admin and can then be used in the Module tab of your adapter communication channel.
When writing the content conversion after that, please don't forget about the newly added characters.
Please note also that the word "Supplier" repeats in all the Dtachkp records; you could use that as well.
If you feel that the field is 13 characters long and that would cause a problem, don't worry: create a dummy field, e.g. split that 13-character field into two fields and use the common part as the key field value and identifier. As I see it, your case is like "500 Supplier", "502 Supplier": you can split off the first 4 characters, and the remaining 9 characters are the key field value.
Try this out.
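The module's effect, prepending a one-character record identifier so it can serve as the key field value, can be sketched like this (a minimal illustration only, not the actual adapter module; it assumes the first line is the header and all following lines are details):

```python
def add_record_identifiers(flat_file_text):
    """Prefix 'H' to the header line and 'D' to every detail line,
    so content conversion can use the prefix as keyFieldValue."""
    lines = flat_file_text.splitlines()
    tagged = []
    for i, line in enumerate(lines):
        prefix = "H" if i == 0 else "D"
        tagged.append(prefix + line)
    return "\n".join(tagged)
```

The content conversion would then declare an extra 1-character field at the start of each structure and set keyFieldValue = H for Dtachkh and D for Dtachkp.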
Rgds
Aditya -
How to start a BPEL Process using the File Adapter
Hi
I would like to automatically start a BPEL Process when I store a file in a specific directory. Can this be done using the File Adapter?
Regards,
Néstor Boscán
Yes, there are samples of how to do this in the BPEL samples directory.
-
PI needs to obtain a zip file via FTP using the File adapter
I have a scenario where PI needs to obtain a zip file via FTP using the File adapter. The zip file contains a number of txt files that I need to process, and the content of one of them has to be sent to an ECC. I am currently using the PayloadZipBean module in the sender File adapter, and I observe two things: if I use the message protocol File, I get a payload for each txt file in the zip, but the payload has no structure; and if I use File Content Conversion, I get an XML structure with only one field containing a strange string, somewhere in which I can see the names of the files, so I assume it is the whole content of the zip file. Can anyone help with how I could achieve what I need, which is to pull the zip file via SAP PI, unzip it, and send the content of one of the txt files to an ECC via ABAP proxy? Thanks in advance for your answers.
Regards,
Raul Alvarado
Hello Raul,
You can do it in the following way:
1. Pick up the zip file, simply extract it, and dump the contents into another temp folder (you can use scripts at OS level).
2. Then use another sender communication channel to pick up all these text files.
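The extract-and-dump step above could be scripted, for example, like this (a minimal sketch; the paths are placeholders for your actual pickup and temp folders):

```python
import os
import zipfile

def extract_zip_to_temp(zip_path, temp_dir):
    """Extract every entry of the picked-up zip into a temp folder,
    where a second sender channel can then poll for the txt files."""
    os.makedirs(temp_dir, exist_ok=True)
    with zipfile.ZipFile(zip_path) as archive:
        archive.extractall(temp_dir)
    return sorted(os.listdir(temp_dir))
```

In a real landscape this would run as an OS-level job between the two communication channels.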
For further clarification you can also use these links. -
Process txt files in zip file
Accessing File using FTP from Java Mapping
File Sender Adapter with FTP protocol
BR
Raj -
Can I control filenaming when archiving files using the file adapter?
Hi folks,
Is there any way to control the filename used when the File Adapter writes out to an archive?
Second question, I also need to be able to pass a "filename" to the adapter from an "input file." Is there a way to do this in the file adapter?
Sincerely,
lpac
Hi,
I have done that with the ftp adapter. In the .xsl file I wrote the following after the <xsl:stylesheet version="1.0" ....> tag:
<xsl:variable name="INFILENAME" select="ehdr:getRequestHeader('/fhdr:InboundFTPHeaderType/fhdr:fileName','fhdr=http://xmlns.oracle.com/pcbpel/adapter/ftp/;')"/>
<xsl:template match="/">
<xsl:variable name="OUTFILENAME"
select="ehdr:setOutboundHeader('/fhdr:OutboundFileHeaderType/fhdr:fileName', $INFILENAME, 'fhdr=http://xmlns.oracle.com/pcbpel/adapter/file/;')"/>
<opaque:opaqueElement>
<xsl:value-of select="/opaque:opaqueElement"/>
</opaque:opaqueElement>
</xsl:template>
</xsl:stylesheet>
To use this with the file adapter, you would have to write "file" wherever "ftp" is written.
Hope this helps,
Zaloa -
How to get the filename in mapping when using sender File adapter?
hi Experts,
I have a scenario where XI reads the input file using a sender file adapter.
The file name is configured in the communication channel.
Is it possible to read this file name in my message mapping?
Thanks
gopal
Hi Gopal,
Use Dynamic Configuration - /people/michal.krawczyk2/blog/2005/11/10/xi-the-same-filename-from-a-sender-to-a-receiver-file-adapter--sp14
Regards,
Geetha -
Can I use the File Adapter to transfer files from/to my own PC
Hello,
Would it be possible to set up the file adapter to send or receive files from my own PC's file system?
For example, I would like to transfer a file from my PC to PI instead of using PI's file system.
Can I do that?
Thanks
Marcelo
If your PC runs a Windows-based system and SAP XI is Unix-based, you have a problem: Unix can't connect to a UNC path. Unix doesn't understand a path like \\mywindowspc\myfilesharefolder\, and Windows can't mount Unix drives. There is a tool that can mount a Windows UNC file share on the Unix file system like a drive, but this tool is not freeware and has problems when you switch off your PC without unmounting your Windows drives on the Unix system. You could use Samba, but admins don't like installing it on Unix...
So, if you want to use a File Adapter you can only set an FTP path: create an FTP site on your own PC, and so on...
Another way to send information from/to your own PC is the Plain J2SE Adapter Engine. This is a standalone part of SAP XI: you install it on your PC and configure the URL of the XI pipeline, the local folders to transfer from/to your PC, the masks and types of your files, the polling times, and any rules and DOS commands to run before/after sending and receiving, and so on...
When sending files from the PC to XI, the files are transformed into XI payloads inside this Adapter Engine and then go to the pipeline as XI messages... When you send them back, the payloads come to this Adapter Engine and are transformed into files on your PC.
I used this Plain J2SE AE on my work notebook to test sending messages to many different XI systems.
For more info: http://help.sap.com/saphelp_nw04/helpdata/en/6f/246b3de666930fe10000000a114084/content.htm
Regards. -
Noise on audio output when using the power adapter
I'm a musician and use my mid-2007 MBP 15" in my live rig. When I use the power adapter, I hear a buzz, sometimes accompanied by a slightly louder buzz that cycles on & off about 1x/second. There are a few weird things going on here:
1. The buzz does not happen all the time. That leads me to think it may be noise in the AC line that the adapter is plugged into.
2. The noise is heard when I use both the built-in audio output AND an external USB audio interface.
In any case, when I disconnect the power adapter and run my MBP on battery, the noise goes away completely. I can probably do this on most gigs, but I'm a little nervous since I run a FW drive, an external USB audio interface, and my midi keyboard (powered by USB) so I am putting a little heavier than usual load on the MBP's battery.
As I said, this may be a noisy AC line. Another guess is that it's the PMU inducing noise into the audio circuitry. I'm hoping a simple AC line filter can help me but I thought I would ask here first in case anyone else has dealt with this. TIA for any help!
I had a similar problem and fixed it with a $5 universal travel adapter. Yours might be different but here's my situation. Try it for little cost and see if it works.
I'm also a muso with an iMac running external KRK speakers via USB Mbox2. Previously I had a Macbook Pro and the same audio problem. I've been putting up with a very low volume, high pitch noise until now. I simply plugged the Mac power cord into a universal travel adapter which then goes into the wall. For Aussie users there is an adapter on ebay that will allow a 3 pin Aussie plug to go into female side and has a two pin Aussie male on the other. For US users you can probably get the same although there is a 59c product that will do the trick called a ground lift adapter. Applecare actually agreed this was the best fix and confirmed there would be no problems with power supply or Mac performance. What a gem ! God knows I've looked into all kinds of alternatives that could have cost hundreds instead of $5. -
Which user is used to pick up and archive files using the file adapter?
Hi guys,
Which one is it? Is it XIPADM?
thank you,
Olian
Hi Olian,
You can archive the file from the FTP server, or you can even keep the folders on your local machine; it's your choice.
Suppose you want to keep an archive directory, i.e. a folder with another copy of the files you are sending from XI; you can follow the steps below:
1. While configuring the sender File adapter, select the Archive Directory option in the File adapter parameters.
2. Select that radio button, first create a separate folder in the FTP location, and then enter that directory path in the Archive Directory field.
3. If you also want to delete the files from the source directory that XI picks them up from, select the option called Delete. The file will then be deleted automatically once it is picked up, while another copy is maintained in the archive folder or directory.
I hope this is useful and sufficient for you.
Regards:
Amar Srinivas Eli -
Dynamic Destination based on data in message using the File adapter
I am unsure of where to start searching for a clue as to how to implement this. Any comments would be much appreciated.
Scenario
IDOC -> XI -> File adapter -> File System
(DESADV) (variable based on info in the DESADV, i.e. Site)
Essentially I wish to use XI as a router and transformer for certain message types depending on the data within the message itself.
Is anyone aware of any documentation around this kind of scenario?
Additional Notes
There is the possibility of having up to 1000 different destinations for the same message, but the message will ONLY be sent to the site addressed within it.
Thanks in advance.
Hi Richard,
The scenario requirement is not yet very clear.
But if you want to route the IDoc to different receiver systems depending on the payload value, you may configure it in ID with different business services and conditional Receiver Determination using XPath.
That is one way; and if you want to use the same receiver service and only different target folders on the file system, then you can surely use Variable Substitution for the Target Directory in the NFS File adapter. You can build the target path from the payload value in the variable substitution table under the Advanced tab of the File adapter. Remember to set the "Create Target Directory" indicator under the Target tab.
Hope one of these might be a solution for you. Let me know if you need more detailed information.
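As a purely illustrative sketch of the variable-substitution approach (the payload path, segment names, and folder layout here are hypothetical; adjust them to your actual DESADV structure), the Advanced tab could look like:

```text
Target Directory:      /outbound/%site%/
Variable Substitution:
  Variable Name: site
  Reference:     payload:DESADV,1,IDOC,1,E1EDK01,1,SITE,1
```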
Regards,
Suddha -
Performance issue when using the same query in a different way
Hello,
I have a performance problem with the statement below when running it with an insert or with execute immediate.
n.b.: This statement could be more optimized, but it is a generated statement.
When I run this statement I get one row back within one second, so there is no performance problem.
select sysdate
,5
,'testje'
,count (1)
,'NL' groupby
from (select 'different (target)' compare_type
,t.id_org_addr id_org_addr -- ID_ORG_ADDR
,t.vpd_country vpd_country -- CTL_COUNTRY
,t.addr_type addr_type -- ADDRESSTYP_COD
from (select *
from (select t.*
from ods.ods_org_addr t
left outer join
m_sy_foreign_key m
on m.vpd_country = t.vpd_country
and m.key_type = 'ORGADDR2'
and m.target_value = t.id_org_addr
where coalesce (t.end_date, to_date ('99991231', 'yyyymmdd')) >= sysdate) /*SGRB*/
where vpd_country = 'NL' /*EGRB*/
) t
where exists
(select null
from (select *
from (select m.target_value id_org_addr
,s.wkp_id_cegedim || '-' || s.adr_id_cegedim || '-' || s.addresstyp_cod id_cegedim
,s.*
from okc_mdl_workplace_address s
left outer join
m_sy_foreign_key m
on m.vpd_country = s.ctl_country
and m.key_type = 'ORGADDR2'
and m.source_value = s.wkp_id_cegedim || '-' || s.adr_id_cegedim || '-' || s.addresstyp_cod
where coalesce (s.end_val_dat, to_date ('99991231', 'yyyymmdd')) >= sysdate) /*SGRB*/
where ctl_country = 'NL' /*EGRB*/
) s
where t.id_org_addr = s.id_org_addr)
minus
select 'different (target)' compare_type
,s.id_org_addr id_org_addr -- ID_ORG_ADDR
,s.ctl_country vpd_country -- CTL_COUNTRY
, (select to_number (l.target_value)
from okc_code_foreign l
where l.source_code_type = 'TYS'
and l.target_code_type = 'ADDRLINKTYPE'
and l.source_value = upper (s.addresstyp_cod)
and l.vpd_country = s.ctl_country)
addr_type -- ADDRESSTYP_COD
from (select *
from (select m.target_value id_org_addr
,s.wkp_id_cegedim || '-' || s.adr_id_cegedim || '-' || s.addresstyp_cod id_cegedim
,s.*
from okc_mdl_workplace_address s
left outer join
m_sy_foreign_key m
on m.vpd_country = s.ctl_country
and m.key_type = 'ORGADDR2'
and m.source_value = s.wkp_id_cegedim || '-' || s.adr_id_cegedim || '-' || s.addresstyp_cod
where coalesce (s.end_val_dat, to_date ('99991231', 'yyyymmdd')) >= sysdate) /*SGRB*/
where ctl_country = 'NL' /*EGRB*/
) s)
When I run this statement using an insert by placing
insert into okc_compare_results (
datetime
,compare_tables_id
,compare_target
,record_count
,groupby
) before the statement, it runs for about *3 to 4 minutes*. The same happens when running only the select part using execute immediate.
Below are the execution plans of the insert with the select, and of the select only.
Could somebody tell me what causes the different behavior of the "same" statement, and what I could do to avoid this behavior?
The database version is: 11.1.0.7.0
Regards,
Fred.
SQL Statement which produced this data:
select * from table(dbms_xplan.display_cursor ('cuk3uwnxx344q',0 /*3431532430 */))
union all
select * from table(dbms_xplan.display_cursor ('862aq599gfd6n',0/*3531428851 */))
plan_table_output
SQL_ID cuk3uwnxx344q, child number 0
select sysdate ,:"SYS_B_00" ,:"SYS_B_01"
,count (:"SYS_B_02") ,:"SYS_B_03" groupby from ( (select
:"SYS_B_04" compare_type ,t.id_org_addr id_org_addr
-- ID_ORG_ADDR ,t.vpd_country vpd_country --
CTL_COUNTRY ,t.addr_type addr_type -- ADDRESSTYP_COD
from (select * from (select t.*
from ods.ods_org_addr t
left outer join
m_sy_foreign_key m on
m.vpd_country = t.vpd_country ; and
m.key_type = :"SYS_B_05" and
m.target_value = t.id_org_addr ; where
coalesce (t.end_date, to_date (:"SYS_B_06", :"SYS_B_07")) >= sysdate)
/*SGRB*/ where vpd_country = :"SYS_B_08" /*EGRB*/
Plan hash value: 3431532430
Id Operation Name Rows Bytes Cost (%CPU) Time Pstart Pstop
0 SELECT STATEMENT 1772 (100)
1 SORT AGGREGATE 1
2 VIEW 3 1772 (1) 00:00:22
3 MINUS
4 SORT UNIQUE 3 492 1146 (1) 00:00:14
* 5 HASH JOIN OUTER 3 492 1145 (1) 00:00:14
6 NESTED LOOPS
7 NESTED LOOPS 3 408 675 (1) 00:00:09
* 8 HASH JOIN 42 4242 625 (1) 00:00:08
9 PARTITION LIST SINGLE 3375 148K 155 (2) 00:00:02 KEY KEY
* 10 TABLE ACCESS FULL OKC_MDL_WORKPLACE_ADDRESS 3375 148K 155 (2) 00:00:02 KEY KEY
* 11 INDEX RANGE SCAN PK_M_SY_FOREIGN_KEY 49537 2709K 469 (1) 00:00:06
* 12 INDEX UNIQUE SCAN UK_ODS_ORG_ADDR 1 1 (0) 00:00:01
* 13 TABLE ACCESS BY GLOBAL INDEX ROWID ODS_ORG_ADDR 1 35 2 (0) 00:00:01 ROWID ROWID
* 14 INDEX RANGE SCAN PK_M_SY_FOREIGN_KEY 49537 1354K 469 (1) 00:00:06
15 NESTED LOOPS
16 NESTED LOOPS 1 67 9 (12) 00:00:01
17 NESTED LOOPS 1 48 8 (13) 00:00:01
* 18 HASH JOIN 1 23 6 (17) 00:00:01
* 19 TABLE ACCESS BY GLOBAL INDEX ROWID ODS_COUNTRY_SYSTEM 1 11 2 (0) 00:00:01 ROWID ROWID
* 20 INDEX RANGE SCAN PK_ODS_DIVISION_SYSTEM 1 1 (0) 00:00:01
* 21 TABLE ACCESS FULL SY_SOURCE_CODE 8 96 3 (0) 00:00:01
22 TABLE ACCESS BY INDEX ROWID SY_FOREIGN_CODE 1 25 2 (0) 00:00:01
* 23 INDEX RANGE SCAN PK_SY_FOREIGN_CODE 1 1 (0) 00:00:01
* 24 INDEX UNIQUE SCAN PK_SY_TARGET_CODE 1 0 (0)
* 25 TABLE ACCESS BY INDEX ROWID SY_TARGET_CODE 1 19 1 (0) 00:00:01
26 SORT UNIQUE 3375 332K 626 (1) 00:00:08
* 27 HASH JOIN OUTER 3375 332K 625 (1) 00:00:08
28 PARTITION LIST SINGLE 3375 148K 155 (2) 00:00:02 KEY KEY
* 29 TABLE ACCESS FULL OKC_MDL_WORKPLACE_ADDRESS 3375 148K 155 (2) 00:00:02 KEY KEY
* 30 INDEX RANGE SCAN PK_M_SY_FOREIGN_KEY 49537 2709K 469 (1) 00:00:06
Predicate Information (identified by operation id):
5 - access("M"."TARGET_VALUE"="T"."ID_ORG_ADDR" AND "M"."VPD_COUNTRY"="T"."VPD_COUNTRY")
8 - access("M"."SOURCE_VALUE"="S"."WKP_ID_CEGEDIM" :SYS_B_12 S."ADR_ID_CEGEDIM" :SYS_B_13 S."ADDRESSTYP_COD" AND
"M"."VPD_COUNTRY"="S"."CTL_COUNTRY")
10 - filter(COALESCE("S"."END_VAL_DAT",TO_DATE(:SYS_B_14,:SYS_B_15))>=SYSDATE@!)
11 - access("M"."KEY_TYPE"=:SYS_B_11 AND "M"."VPD_COUNTRY"=:SYS_B_16)
12 - access("T"."ID_ORG_ADDR"="M"."TARGET_VALUE")
13 - filter(("T"."VPD_COUNTRY"=:SYS_B_08 AND COALESCE("T"."END_DATE",TO_DATE(:SYS_B_06,:SYS_B_07))>=SYSDATE@!))
14 - access("M"."KEY_TYPE"=:SYS_B_05 AND "M"."VPD_COUNTRY"=:SYS_B_08)
18 - access("CS"."ID_SYSTEM"="SK"."ID_SOURCE_SYSTEM")
19 - filter("CS"."SYSTEM_TYPE"=1)
20 - access("CS"."VPD_COUNTRY"=:B1 AND "CS"."EXP_IMP_TYPE"='I')
filter("CS"."EXP_IMP_TYPE"='I')
21 - filter("SK"."CODE_TYPE"=:SYS_B_18)
23 - access("FK"."ID_SOURCE_CODE"="SK"."ID_SOURCE_CODE" AND "FK"."SOURCE_VALUE"=UPPER(:B1) AND
"CS"."VPD_COUNTRY"="FK"."VPD_COUNTRY")
filter(("FK"."VPD_COUNTRY"=:B1 AND "FK"."SOURCE_VALUE"=UPPER(:B2) AND "CS"."VPD_COUNTRY"="FK"."VPD_COUNTRY"))
24 - access("FK"."ID_TARGET_CODE"="TK"."ID_TARGET_CODE")
25 - filter("TK"."CODE_TYPE"=:SYS_B_19)
27 - access("M"."SOURCE_VALUE"="S"."WKP_ID_CEGEDIM" :SYS_B_23 S."ADR_ID_CEGEDIM" :SYS_B_24 S."ADDRESSTYP_COD" AND
"M"."VPD_COUNTRY"="S"."CTL_COUNTRY")
29 - filter(COALESCE("S"."END_VAL_DAT",TO_DATE(:SYS_B_25,:SYS_B_26))>=SYSDATE@!)
30 - access("M"."KEY_TYPE"=:SYS_B_22 AND "M"."VPD_COUNTRY"=:SYS_B_27)
SQL_ID 862aq599gfd6n, child number 0
insert into okc_compare_results ( datetime
,compare_tables_id ,compare_target
,record_count ,groupby )
select sysdate ,:"SYS_B_00" ,:"SYS_B_01"
,count (:"SYS_B_02") ,:"SYS_B_03" groupby from ( (select
:"SYS_B_04" compare_type ,t.id_org_addr id_org_addr
-- ID_ORG_ADDR ,t.vpd_country vpd_country --
CTL_COUNTRY ,t.addr_type addr_type -- ADDRESSTYP_COD
from (select * from (select t.*
from ods.ods_org_addr t
left outer join
m_sy_foreign_key m on
m.vpd_country = t.vpd_country ; and
m.key_type = :"SYS_B_05" and
m.target_value = t.id_org_addr
Plan hash value: 3531428851
Id Operation Name Rows Bytes Cost (%CPU) Time Pstart Pstop
0 INSERT STATEMENT 1646 (100)
1 LOAD TABLE CONVENTIONAL
2 SORT AGGREGATE 1
3 VIEW 1 1646 (1) 00:00:20
4 MINUS
5 SORT UNIQUE 1 163
6 NESTED LOOPS OUTER 1 163 1067 (1) 00:00:13
7 NESTED LOOPS 1 135 599 (1) 00:00:08
* 8 HASH JOIN 19 1919 577 (2) 00:00:07
9 PARTITION LIST SINGLE 1535 69075 107 (4) 00:00:02 KEY KEY
* 10 TABLE ACCESS FULL OKC_MDL_WORKPLACE_ADDRESS 1535 69075 107 (4) 00:00:02 KEY KEY
* 11 INDEX RANGE SCAN PK_M_SY_FOREIGN_KEY 49537 2709K 469 (1) 00:00:06
* 12 TABLE ACCESS BY GLOBAL INDEX ROWID ODS_ORG_ADDR 1 34 2 (0) 00:00:01 ROWID ROWID
* 13 INDEX UNIQUE SCAN UK_ODS_ORG_ADDR 25 1 (0) 00:00:01
* 14 INDEX RANGE SCAN PK_M_SY_FOREIGN_KEY 1 28 468 (1) 00:00:06
15 NESTED LOOPS
16 NESTED LOOPS 1 67 8 (0) 00:00:01
17 NESTED LOOPS 1 48 7 (0) 00:00:01
18 NESTED LOOPS 1 23 5 (0) 00:00:01
* 19 TABLE ACCESS BY GLOBAL INDEX ROWID ODS_COUNTRY_SYSTEM 1 11 2 (0) 00:00:01 ROWID ROWID
* 20 INDEX RANGE SCAN PK_ODS_DIVISION_SYSTEM 1 1 (0) 00:00:01
* 21 TABLE ACCESS FULL SY_SOURCE_CODE 1 12 3 (0) 00:00:01
22 TABLE ACCESS BY INDEX ROWID SY_FOREIGN_CODE 1 25 2 (0) 00:00:01
* 23 INDEX RANGE SCAN PK_SY_FOREIGN_CODE 1 1 (0) 00:00:01
* 24 INDEX UNIQUE SCAN PK_SY_TARGET_CODE 1 0 (0)
* 25 TABLE ACCESS BY INDEX ROWID SY_TARGET_CODE 1 19 1 (0) 00:00:01
26 SORT UNIQUE 1535 151K
* 27 HASH JOIN OUTER 1535 151K 577 (2) 00:00:07
28 PARTITION LIST SINGLE 1535 69075 107 (4) 00:00:02 KEY KEY
* 29 TABLE ACCESS FULL OKC_MDL_WORKPLACE_ADDRESS 1535 69075 107 (4) 00:00:02 KEY KEY
* 30 INDEX RANGE SCAN PK_M_SY_FOREIGN_KEY 49537 2709K 469 (1) 00:00:06
Predicate Information (identified by operation id):
8 - access("M"."SOURCE_VALUE"="S"."WKP_ID_CEGEDIM" :SYS_B_12 S."ADR_ID_CEGEDIM" :SYS_B_13 S."ADDRESSTYP_COD" AND
"M"."VPD_COUNTRY"="S"."CTL_COUNTRY")
10 - filter(COALESCE("S"."END_VAL_DAT",TO_DATE(:SYS_B_14,:SYS_B_15))>=SYSDATE@!)
11 - access("M"."KEY_TYPE"=:SYS_B_11 AND "M"."VPD_COUNTRY"=:SYS_B_16)
12 - filter((COALESCE("T"."END_DATE",TO_DATE(:SYS_B_06,:SYS_B_07))>=SYSDATE@! AND "T"."VPD_COUNTRY"=:SYS_B_08))
13 - access("T"."ID_ORG_ADDR"="M"."TARGET_VALUE")
14 - access("M"."KEY_TYPE"=:SYS_B_05 AND "M"."VPD_COUNTRY"=:SYS_B_08 AND "M"."TARGET_VALUE"="T"."ID_ORG_ADDR")
filter("M"."TARGET_VALUE"="T"."ID_ORG_ADDR")
19 - filter("CS"."SYSTEM_TYPE"=1)
20 - access("CS"."VPD_COUNTRY"=:B1 AND "CS"."EXP_IMP_TYPE"='I')
filter("CS"."EXP_IMP_TYPE"='I')
21 - filter(("SK"."CODE_TYPE"=:SYS_B_18 AND "CS"."ID_SYSTEM"="SK"."ID_SOURCE_SYSTEM"))
23 - access("FK"."ID_SOURCE_CODE"="SK"."ID_SOURCE_CODE" AND "FK"."SOURCE_VALUE"=UPPER(:B1) AND
"CS"."VPD_COUNTRY"="FK"."VPD_COUNTRY")
filter(("FK"."VPD_COUNTRY"=:B1 AND "FK"."SOURCE_VALUE"=UPPER(:B2) AND "CS"."VPD_COUNTRY"="FK"."VPD_COUNTRY"))
24 - access("FK"."ID_TARGET_CODE"="TK"."ID_TARGET_CODE")
25 - filter("TK"."CODE_TYPE"=:SYS_B_19)
27 - access("M"."SOURCE_VALUE"="S"."WKP_ID_CEGEDIM" :SYS_B_23 S."ADR_ID_CEGEDIM" :SYS_B_24 S."ADDRESSTYP_COD" AND
"M"."VPD_COUNTRY"="S"."CTL_COUNTRY")
29 - filter(COALESCE("S"."END_VAL_DAT",TO_DATE(:SYS_B_25,:SYS_B_26))>=SYSDATE@!)
30 - access("M"."KEY_TYPE"=:SYS_B_22 AND "M"."VPD_COUNTRY"=:SYS_B_27)
Edited by: BluShadow on 20-Jun-2012 10:30
added {noformat}{noformat} tags for readability. Please read {message:id=9360002} and learn to do this yourself.
Yes, all the used tables are analyzed.
Thanks for pointing to the Metalink bug; I also searched in Metalink, but didn't find this bug.
I have a little bit more information about the problem.
I use the following select (now in a readable format)
select count (1)
from ( (select 'different (target)' compare_type
,t.id_org_addr id_org_addr -- ID_ORG_ADDR
,t.vpd_country vpd_country -- CTL_COUNTRY
,t.addr_type addr_type -- ADDRESSTYP_COD
from (select *
from (select t.*
from ods.ods_org_addr t
left outer join
m_sy_foreign_key m
on m.vpd_country = t.vpd_country
and m.key_type = 'ORGADDR2'
and m.target_value = t.id_org_addr
where coalesce (t.end_date, to_date ('99991231', 'yyyymmdd')) >= sysdate) /*SGRB*/
where vpd_country = 'NL' /*EGRB*/
) t
where exists
(select null
from (select *
from (select m.target_value id_org_addr
,s.wkp_id_cegedim || '-' || s.adr_id_cegedim || '-' || s.addresstyp_cod id_cegedim
,s.*
from okc_mdl_workplace_address s
left outer join
m_sy_foreign_key m
on m.vpd_country = s.ctl_country
and m.key_type = 'ORGADDR2'
and m.source_value = s.wkp_id_cegedim || '-' || s.adr_id_cegedim || '-' || s.addresstyp_cod
where coalesce (s.end_val_dat, to_date ('99991231', 'yyyymmdd')) >= sysdate) /*SGRB*/
where ctl_country = 'NL' /*EGRB*/
) s
where t.id_org_addr = s.id_org_addr)
minus
select 'different (target)' compare_type
,s.id_org_addr id_org_addr -- ID_ORG_ADDR
,s.ctl_country vpd_country -- CTL_COUNTRY
, (select to_number (l.target_value)
from okc_code_foreign l
where l.source_code_type = 'TYS'
and l.target_code_type = 'ADDRLINKTYPE'
and l.source_value = upper (s.addresstyp_cod)
and l.vpd_country = s.ctl_country)
addr_type -- ADDRESSTYP_COD
from (select *
from (select m.target_value id_org_addr
,s.wkp_id_cegedim || '-' || s.adr_id_cegedim || '-' || s.addresstyp_cod id_cegedim
,s.*
from okc_mdl_workplace_address s
left outer join
m_sy_foreign_key m
on m.vpd_country = s.ctl_country
and m.key_type = 'ORGADDR2'
and m.source_value = s.wkp_id_cegedim || '-' || s.adr_id_cegedim || '-' || s.addresstyp_cod
where coalesce (s.end_val_dat, to_date ('99991231', 'yyyymmdd')) >= sysdate) /*SGRB*/
where ctl_country = 'NL' /*EGRB*/
) s))
The select is executed in 813 msecs.
When I execute the same select using execute immediate like:
declare
ln_count number;
begin
execute immediate q'[<select statement>]' into ln_count;
end;
This takes 3:56 minutes to complete.
When I change the second coalesce part (the one within the exists) in the following way:
the part
coalesce (s.end_val_dat, to_date ('99991231', 'yyyymmdd')) >= sysdate
is replaced by
s.end_val_dat >= sysdate or s.end_val_dat is null
then the execution time is even faster (560 msecs) in both the plain select and the select using execute immediate. -
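The coalesce rewrite above relies on the predicate coalesce(end_val_dat, date '9999-12-31') >= sysdate being equivalent to (end_val_dat >= sysdate or end_val_dat is null). That equivalence can be sanity-checked outside the database; here is a sketch with Python's None standing in for SQL NULL (the dates are arbitrary examples):

```python
from datetime import date

FAR_FUTURE = date(9999, 12, 31)   # stands in for to_date('99991231','yyyymmdd')
TODAY = date(2012, 6, 20)         # stands in for sysdate

def coalesce_form(end_date):
    # coalesce(end_date, far_future) >= today
    return (end_date if end_date is not None else FAR_FUTURE) >= TODAY

def rewritten_form(end_date):
    # end_date >= today OR end_date IS NULL
    return end_date is None or end_date >= TODAY
```

The rewritten form avoids wrapping the column in a function, which can let the optimizer use the column's statistics (and indexes) directly.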
Can the file extension be removed when using the File Special Field?
I placed a Special Field on the top of my topics that gives
me the filename for that specific topic. I used the Insert >
Field > Special Field > File and the File Name is inserted
where I want it. The problem is that I don't want the file
extension to show, just the name. I can't figure out how to do
this... Any suggestions, anyone? Maybe a script or an applet? Any
input will be greatly appreciated.
Thank you for your response. I did try something similar to
what you suggested, but the problem is that whatever I modify, it
automatically puts it back the way it was as soon as I try to save it.
This is probably because I used the Insert Special Field commands
from the Menu. It automatically places this on the code:
<!--kadov_tag{{<variable name=file
x-format=default
x-value=017.htm>}}-->017.htm<!--kadov_tag{{</variable>}}-->
Where 017.htm is the actual name of the file for the specific
topic. This means that it's apparently doing this on the background
somewhere I can't see and inserting the value on the code for the
topic...
I hope that made sense! Again, I thank you in advance for any
suggestions... -
I normally have my Yahoo! and Gmail e-mail accounts open and logged in the entire time my system is on. Using your instructions, I have set the Firefox 'mailto' option to use Gmail.
It seems Gmail automatically logged me out. When I logged back in and tried again, it still opened a new window, but used my open account to process the e-mail.
Must it always open a new window (which then needs to be closed) in executing this function? -
When using my power adapter, the light is green for two seconds and then turns red.
When using the power adapter for my MacBook 2.1, the little light on the connector is green for 2 or 3 seconds and then turns red. I bought a new adapter, but it does the same thing. It looks like the computer is charging. What's wrong with it?
Nothing wrong. Here is what Macbook manual has to say:
When you first connect the power adapter to your computer, an indicator light on the
power adapter plug starts to glow. An amber light indicates that power is going to the
battery. A green light indicates that no power is going to the battery, which can mean
the battery is fully charged, is not installed, or has a problem. If you don’t see a light,
your plug probably isn’t seated correctly. Check for any debris and remove it. You can
monitor the battery level using the Battery status menu in the menu bar or by
checking the battery level indicator lights on the bottom of the battery (see page 74). -
"Performance" problems with the File adapter on Plain J2SE Adapter Engine
Hi,
At the moment I'm at a customer site for a few days to solve some XI issues. One of the issues is the performance of the Plain J2SE Adapter Engine, which uses the file adapter to transfer XML messages (already in XI message format) from the legacy system to the Integration Engine. The file adapter has to deal with "large" XML messages (the max at the moment is 65 MB), and the engine fails with the following error when transferring the big XML file: "ERROR: Finished sending to Integration Engine with error "java.lang.OutOfMemoryError". Skip confirmation and quit this loop".
As far as I understood from the customer, the memory of the Plain Adapter Engine is set to 512 MB. This is maybe too low, but I don't know where to check this; I only have the adapter web interface in front of me, with no access to the OS itself via, for example, a remote connection.
On the Integration Engine I know there is the ability to split large messages with the file adapter (File Content Conversion), but I don't know about this for the Plain Adapter Engine. Is there a possibility to do this on the Plain Adapter Engine as well?
Thanks in advance for any input.
Greetings,
Patrick
Hi Sameer,
Thanks for your answers.
On the first solution: yes, that is possible; we first decided to see if the legacy system can do the splitting before starting to develop a Java program.
On the second solution: as far as I know, that is possible on the Integration Engine, but we are facing the problems on the Plain J2SE Adapter Engine. I went through that documentation (link:
http://help.sap.com/saphelp_nw04/helpdata/en/6f/246b3de666930fe10000000a114084/frameset.htm ) to look for a similar solution in the Plain Adapter Engine. So my question is: is this possible with the Plain adapter? And if so, what kind of parameters do I need to use to achieve this?
Regards,
Patrick
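The legacy-side splitting discussed above could be scripted before the adapter picks the file up. A hedged sketch (it assumes newline-delimited records rather than one monolithic XML document, and the paths and chunk size are placeholders):

```python
import os

def split_file(src_path, out_dir, lines_per_chunk=100000):
    """Split a large record-per-line file into numbered chunk files,
    so each chunk stays within the adapter engine's memory limit."""
    os.makedirs(out_dir, exist_ok=True)
    chunk_paths = []
    with open(src_path, "r", encoding="utf-8") as src:
        chunk, index = [], 0
        for line in src:
            chunk.append(line)
            if len(chunk) >= lines_per_chunk:
                chunk_paths.append(_write_chunk(out_dir, index, chunk))
                chunk, index = [], index + 1
        if chunk:  # flush the last, possibly partial, chunk
            chunk_paths.append(_write_chunk(out_dir, index, chunk))
    return chunk_paths

def _write_chunk(out_dir, index, lines):
    path = os.path.join(out_dir, "part_%04d.txt" % index)
    with open(path, "w", encoding="utf-8") as out:
        out.writelines(lines)
    return path
```

The adapter engine would then poll the output folder and send each chunk as a separate message.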