Classification of IP SLA
Hello peeps!
I'm wondering how I would classify IP SLA packets.
I have an IP SLA operation running from 192.168.1.1 to 192.168.1.2.
Voice, FTP and streaming are all classified, with LLQ in place,
but the IP SLA probes fail because of the high volume on the serial line.
Hi
You can do a couple of things:
1) Use the 'tos' command under the IP SLA configuration to mark the probes so they fall into the DSCP-based classes (example: http://www.ccde-study.com/2008/11/ip-sla-dscp-testing.html)
2) Ensure that your WAN QoS classifications include the IP SLA streams specifically.
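For option 1, here's a minimal sketch (hedged: the operation number is arbitrary, and this assumes your LLQ voice class matches DSCP EF; ToS byte 184 = 0xB8 corresponds to DSCP EF):

```
! Sketch only - adjust operation number, addresses and class names to your setup.
ip sla 10
 icmp-echo 192.168.1.2 source-ip 192.168.1.1
 tos 184                ! ToS 0xB8 = DSCP EF (46), the same marking as voice
ip sla schedule 10 life forever start-time now
!
! An LLQ class that matches DSCP EF will then also catch the probes:
class-map match-any VOICE
 match dscp ef
```

If you'd rather not mix probes into the voice queue, pick a ToS value that maps to whatever DSCP class you want the probes measured in.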
Regards
Aaron
Please rate helpful posts.
Similar Messages
-
Generating Supplier Liability Account based on the Supplier Classification
Hi ,
In R12 I have a custom requirement as follows: I need to generate the Supplier Liability Account based on the Supplier Classification. When we select a supplier type, the default liability account value comes from Supplier --> Setup --> Options --> Financial Options --> Accounting tab (GL Accounts: Liability value).
So when we change the supplier type, the liability account should change based on that classification value.
I have code in 11i where custom.pll was modified to achieve this, but how do I implement this in R12, as the supplier form is OAF-based?
the code from custom.pll for this is
procedure event(event_name varchar2)
is
lCurrBlock varchar2(30);
lVendorType varchar2(30);
lDefaultCCID number(15);
lCustomCCID number(15);
cursor customCC(x_vendor_type varchar2,x_default_cc_id number) is
select
cst.code_combination_id
from
fnd_lookup_values vty,
gl_code_combinations dfl,
gl_code_combinations cst
where 1=1
and vty.view_application_id = 201
and vty.lookup_type = 'VENDOR TYPE'
and vty.language = 'US'
and vty.lookup_code = x_vendor_type
and dfl.code_combination_id = x_default_cc_id
and dfl.segment1 = cst.segment1
and vty.tag = cst.segment2 --account no
and dfl.segment3 = cst.segment3
and dfl.segment4 = cst.segment4
and dfl.segment5 = cst.segment5
and (dfl.segment6 = cst.segment6 or (dfl.segment6 is null and cst.segment6 is null))
and (dfl.segment7 = cst.segment7 or (dfl.segment7 is null and cst.segment7 is null))
and (dfl.segment8 = cst.segment8 or (dfl.segment8 is null and cst.segment8 is null))
and (dfl.segment9 = cst.segment9 or (dfl.segment9 is null and cst.segment9 is null))
and (dfl.segment10 = cst.segment10 or (dfl.segment10 is null and cst.segment10 is null))
and (dfl.segment11 = cst.segment11 or (dfl.segment11 is null and cst.segment11 is null))
and (dfl.segment12 = cst.segment12 or (dfl.segment12 is null and cst.segment12 is null))
and (dfl.segment13 = cst.segment13 or (dfl.segment13 is null and cst.segment13 is null))
and (dfl.segment14 = cst.segment14 or (dfl.segment14 is null and cst.segment14 is null))
and (dfl.segment15 = cst.segment15 or (dfl.segment15 is null and cst.segment15 is null))
and (dfl.segment16 = cst.segment16 or (dfl.segment16 is null and cst.segment16 is null))
and (dfl.segment17 = cst.segment17 or (dfl.segment17 is null and cst.segment17 is null))
and (dfl.segment18 = cst.segment18 or (dfl.segment18 is null and cst.segment18 is null))
and (dfl.segment19 = cst.segment19 or (dfl.segment19 is null and cst.segment19 is null))
and (dfl.segment20 = cst.segment20 or (dfl.segment20 is null and cst.segment20 is null))
and (dfl.segment21 = cst.segment21 or (dfl.segment21 is null and cst.segment21 is null))
and (dfl.segment22 = cst.segment22 or (dfl.segment22 is null and cst.segment22 is null))
and (dfl.segment23 = cst.segment23 or (dfl.segment23 is null and cst.segment23 is null))
and (dfl.segment24 = cst.segment24 or (dfl.segment24 is null and cst.segment24 is null))
and (dfl.segment25 = cst.segment25 or (dfl.segment25 is null and cst.segment25 is null))
and (dfl.segment26 = cst.segment26 or (dfl.segment26 is null and cst.segment26 is null))
and (dfl.segment27 = cst.segment27 or (dfl.segment27 is null and cst.segment27 is null))
and (dfl.segment28 = cst.segment28 or (dfl.segment28 is null and cst.segment28 is null))
and (dfl.segment29 = cst.segment29 or (dfl.segment29 is null and cst.segment29 is null))
and (dfl.segment30 = cst.segment30 or (dfl.segment30 is null and cst.segment30 is null));
begin
if name_in('SYSTEM.CURRENT_FORM') != 'APXVDMVD' then
return;
end if;
lCurrBlock := name_in('SYSTEM.CURSOR_BLOCK');
if event_name = 'WHEN-NEW-FORM-INSTANCE' then
-- app_item_property.set_property('VNDR.VENDOR_TYPE_DISP_MIR',required,property_true);
copy(name_in('WORLD.ACCTS_PAY_CODE_COMBINATION_ID'),'GLOBAL.XX_APXVDMVD_ACCTS_PAY_CC_ID');
elsif lCurrBlock = 'VNDR' and event_name in ('WHEN-VALIDATE-RECORD','WHEN-NEW-RECORD-INSTANCE') then
open customCC(name_in('VNDR.VENDOR_TYPE_LOOKUP_CODE'),name_in('GLOBAL.XX_APXVDMVD_ACCTS_PAY_CC_ID'));
fetch customCC into lCustomCCID;
close customCC;
if lCustomCCID is null then
copy(name_in('GLOBAL.XX_APXVDMVD_ACCTS_PAY_CC_ID'),'WORLD.ACCTS_PAY_CODE_COMBINATION_ID');
else
copy(lCustomCCID,'WORLD.ACCTS_PAY_CODE_COMBINATION_ID');
end if;
end if;
end;
please help
Hi,
While you can refer to the SLA guide, here are some quick pointers on what you need to do:
1. Identify the transaction types and associated events for that transaction.
2. Build an accounting rule for the above.
An example would be AP invoice validation or application of prepayments.
The SLA guide provides step by step details on the above.
Good Luck !!!
Regards,
Udit -
CRM 4; Utilities Add On; Service Ticket; SLA
Hello Experts,
We are implementing CRM4, service industry with utilities add-on.
We are using the service ticket for service request creation, applying the four-tier classification of call types.
Each call type has an SLA timeframe for completion.
The SLAs are neither product nor contract dependent; they relate only to the type of service stipulated in the service ticket.
I have set up the date profile etc.
I would assume that a dummy product and a dummy contract will have to be created to invoke the SLAs, but I am not managing to get the service ticket to see the response times.
Could you please point me in the right direction? I.e., how can I create an SLA against each code (category; code group; subject profile), without using service contracts?
Much appreciated!
Hi Tanya,
We got the escalation time picked up in the ticket via a BAdI.
This BAdI is called on saving the ticket. It checks the code, fetches the code-related SLA time from the response profile (which is also assigned to the dummy product), checks the date profile's date types and date rules, performs any calculation that is needed, and then populates the times.
I think the need for a contract arises only because response profiles can normally be attached only to a dummy product master of contract type. You can actually add that set type to your (dummy) service product and then try this; I'm sure it will work.
It works for us.
Regards
Raj -
Multiple customers, different SLA's
Hi,
I have a question about the configuration of SCSM Incident Management in the following situation.
We have 6 different customers, who all send their incidents to our service desk.
The Service level agreements are different for these 6 customers.
For example,
a priority 3 incident for Customer1 needs to be resolved within 4 working days.
a priority 3 incident for Customer2 needs to be resolved within 5 working days.
and so on.
What's the best way to configure this in SCSM, so that the correct SLA times are measured for each customer?
We tried to do this using the Service Level Objectives, so based on Calendar, Queue and Metric.
For example:
ServiceLevelObjective1: 'Customer1 - Priority 2' Metric Resolution time: 16 hours
ServiceLevelObjective2: 'Customer1 - Priority 3' Metric Resolution time: 32 hours
ServiceLevelObjective3: 'Customer2 - Priority 2' Metric Resolution time: 8 hours
ServiceLevelObjective4: 'Customer2 - Priority 3' Metric Resolution time: 24 hours
We thought it would be possible to assign an incident to a queue, so that the SLO configured for that queue would take effect. But for some reason I do not see an option to assign an incident to a queue.
I look forward to hearing how SCSM handles this, and what the advised way is to configure it.
Thanks in advance for your help!
Kind regards
Emile
You have different options to reach your goal.
We implemented the following 3 approaches at different customers:
1.
Create different Support Groups/Tier Queues for each customer:
Customer 1/Tier 1
Customer 1/Tier 2
Customer 1/Tier 3
Customer 2/Tier 1
Customer 2/Tier 2
Choose different Tier Queues/Support Groups and Priority to build the Queues.
Configure different SLOs for different Queues with different Target Resolution Times.
Choose the matching Tier Queue/Support Group for each customer in Incident ticket.
2.
Create Incident Classifications for each customer:
Customer 1/Mail Problem
Customer 1/Whatever Problem
Customer 2/Mail Problem
Choose different Classification Categories and Priority to build the Queues.
Configure different SLOs for different Queues with different Target Resolution Times.
Choose the matching Classification Category for each customer in the Incident ticket.
3.
Extend the Incident class with an enumeration list, for instance "Customers".
Add the values Customer 1, Customer 2, .. to the list.
Customize the Incident form and add the new list box.
Choose different list values of the Customers property and Priority to build the Queues.
Configure different SLOs for different Queues with different Target Resolution Times.
Choose matching customer from the list box in each Incident ticket.
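As a minimal sketch of the lookup that approach 3 effectively builds (the customer names and hours are taken from Emile's example above, not from a real SCSM setup):

```python
# Sketch of approach 3: the (customer, priority) pair on an incident
# selects a Queue, and each Queue carries one SLO resolution target.
# Values mirror the example SLOs from the question.
SLO_TARGET_HOURS = {
    ("Customer1", 2): 16,
    ("Customer1", 3): 32,
    ("Customer2", 2): 8,
    ("Customer2", 3): 24,
}

def resolution_target(customer: str, priority: int) -> int:
    """Return the SLO resolution time (in hours) for an incident."""
    return SLO_TARGET_HOURS[(customer, priority)]

print(resolution_target("Customer2", 3))  # -> 24
```

In SCSM itself this mapping lives in the Queue criteria and the SLO metrics, not in code; the snippet only shows the selection logic you are configuring.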
Hope this helps and one of the approaches works for you.
Andreas Baumgarten | H&D International Group -
IP SLA/EEM running out of VTY lines and failing
I am using IP SLA to ping network devices from an AS5400XM voice gateway to detect network failures. The AS5400XM platform is limited to 5 vty lines (vty 0 4). When simulating a simultaneous outage, there are not enough tty lines available to process my EEM events:
Feb 9 15:16:22.970 CST: %HA_EM-3-FMPD_CLI_CONNECT: Unable to establish CLI session: no tty lines available, minimum of 2 required by EEM
Feb 9 15:16:22.970 CST: %HA_EM-3-FMPD_ERROR: Error executing applet ReportIPSLAevent_1005065020_up statement 1.1
Feb 9 15:16:22.974 CST: %HA_EM-3-FMPD_CLI_CONNECT: Unable to establish CLI session: no tty lines available, minimum of 2 required by EEM
Feb 9 15:16:22.978 CST: %HA_EM-3-FMPD_ERROR: Error executing applet ReportIPSLAevent_1005081020_up statement 1.1
Feb 9 15:16:22.986 CST: %HA_EM-3-FMPD_CLI_CONNECT: Unable to establish CLI session: no tty lines available, minimum of 2 required by EEM
Feb 9 15:16:22.986 CST: %HA_EM-3-FMPD_ERROR: Error executing applet ReportIPSLAevent_100000226_up statement 1.1
Feb 9 15:16:22.994 CST: %HA_EM-3-FMPD_CLI_CONNECT: Unable to establish CLI session: no tty lines available, minimum of 2 required by EEM
Feb 9 15:16:22.994 CST: %HA_EM-3-FMPD_ERROR: Error executing applet ReportIPSLAevent_1001012021_up statement 1.1
Feb 9 15:16:23.006 CST: %HA_EM-3-FMPD_CLI_CONNECT: Unable to establish CLI session: no tty lines available, minimum of 2 required by EEM
Feb 9 15:16:23.006 CST: %HA_EM-3-FMPD_ERROR: Error executing applet ReportIPSLAevent_1061247018_up statement 1.1
How can I detect/wait until there are enough lines free before processing the EEM rules?
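One direction I'm considering (an untested sketch; the scheduler command below needs EEM 3.0, roughly 12.4(22)T or later, so it would have to be verified against the AS5400XM image) is to serialize applet execution so that concurrent events queue for a CLI session instead of all failing at once:

```
! Untested sketch - verify command availability on your IOS release.
! Run EEM applets one at a time; queued events wait their turn
! instead of competing for the limited vty/tty lines.
event manager scheduler applet thread class default number 1
```
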
Regards,
-Doug
Joe,
So far, it looks like there are only two issues remaining:
Issue #1 - E-mail subject has destination listed as "unknown" versus the hostname
Feb 18 10:39:33.297 CST: %HA_EM-6-LOG: sl_ip_sla_report.tcl : DEBUG(smtp_lib) : smtp_write : Subject: IP SLA alert - {router} connectivity to Unknown has been restored
Issue #2 - Source IP address issue (the SMTP relay is restricted to IP 10.5.32.90). Traffic to the SMTP server needs to be sourced from the Loopback0 address 10.5.32.90. There are NAT rules in place to cover this, but it looks like the Tcl script is bypassing the NAT:
#sh ip int brief | ex una
Interface IP-Address OK? Method Status Protocol
GigabitEthernet0/0 10.5.34.242 YES NVRAM up up
GigabitEthernet0/1 10.5.34.250 YES NVRAM up up
Loopback0 10.5.32.90 YES NVRAM up up
NVI0 10.5.32.90 YES unset up up
ip nat inside source list smtp-nat interface Loopback0 overload
sh ip access-lists smtp-nat
Extended IP access list smtp-nat
10 permit ip host 10.5.34.242 host 10.0.10.10 (2 matches)
20 permit ip host 10.5.34.250 host 10.0.10.10 (15 matches)
Feb 18 10:39:33.297 CST: FIBfwd-proc: sending link IP ip_pak_table 0 ip_nh_table 65535 if GigabitEthernet0/0 nh 10.5.34.241 uhp 1 deag 0 ttlexp 0 rec 0
Feb 18 10:39:33.297 CST: IP: s=10.5.34.242 (local), d=10.0.10.10 (GigabitEthernet0/0), len 58, sending
Feb 18 10:39:33.297 CST: TCP src=26095, dst=25, seq=3453704840, ack=2112864439, win=3890 ACK
Feb 18 10:39:33.297 CST: IP: s=10.5.34.242 (local), d=10.0.10.10 (GigabitEthernet0/0), len 58, output feature
Feb 18 10:39:33.297 CST: TCP src=26095, dst=25, seq=3453704840, ack=2112864439, win=3890 ACK, CCE Output Classification(5), rtype 1, forus FALSE, sendself FALSE, mtu 0, fwdchk FALSE
Feb 18 10:39:33.297 CST: IP: s=10.5.32.90 (local), d=10.0.10.10 (GigabitEthernet0/0), len 58, output feature
Feb 18 10:39:33.297 CST: TCP src=26095, dst=25, seq=3453704840, ack=2112864439, win=3890 ACK, Post-routing NAT Outside(17), rtype 1, forus FALSE, sendself FALSE, mtu 0, fwdchk FALSE
Feb 18 10:39:33.297 CST: IP: s=10.5.32.90 (local), d=10.0.10.10 (GigabitEthernet0/0), len 58, output feature
Feb 18 10:39:33.297 CST: TCP src=26095, dst=25, seq=3453704840, ack=2112864439, win=3890 ACK, Stateful Inspection(20), rtype 1, forus FALSE, sendself FALSE, mtu 0, fwdchk FALSE
Feb 18 10:39:33.297 CST: IP: s=10.5.32.90 (local), d=10.0.10.10 (GigabitEthernet0/0), len 58, sending full packet
Feb 18 10:39:33.297 CST: TCP src=26095, dst=25, seq=3453704840, ack=2112864439, win=3890 ACK
Feb 18 10:39:33.297 CST: [fh_smtp_debug_cmd]
Feb 18 10:39:33.297 CST: %HA_EM-6-LOG: sl_ip_sla_report.tcl : DEBUG(smtp_lib) : smtp_write : From: {router}@mydomain.com
Feb 18 10:39:33.297 CST: %HA_EM-6-LOG: sl_ip_sla_report.tcl : DEBUG(smtp_lib) : smtp_write : To: [email protected]
Feb 18 10:39:33.297 CST: %HA_EM-6-LOG: sl_ip_sla_report.tcl : DEBUG(smtp_lib) : smtp_write : Cc:
Feb 18 10:39:33.297 CST: %HA_EM-6-LOG: sl_ip_sla_report.tcl : DEBUG(smtp_lib) : smtp_write : Subject: IP SLA alert - {router} connectivity to Unknown has been restored
Feb 18 10:39:33.297 CST: %HA_EM-6-LOG: sl_ip_sla_report.tcl : DEBUG(smtp_lib) : smtp_write : IPSLAs Latest Operation Statistics
Feb 18 10:39:33.297 CST: %HA_EM-6-LOG: sl_ip_sla_report.tcl : DEBUG(smtp_lib) : smtp_write : IPSLA operation id: 100000226
Feb 18 10:39:33.297 CST: %HA_EM-6-LOG: sl_ip_sla_report.tcl : DEBUG(smtp_lib) : smtp_write : Type of operation: icmp-echo
Feb 18 10:39:33.297 CST: %HA_EM-6-LOG: sl_ip_sla_report.tcl : DEBUG(smtp_lib) : smtp_write : Latest RTT: 36 milliseconds
Feb 18 10:39:33.297 CST: %HA_EM-6-LOG: sl_ip_sla_report.tcl : DEBUG(smtp_lib) : smtp_write : Latest operation start time: 10:39:29.377 CST Thu Feb 18 2010
Feb 18 10:39:33.297 CST: %HA_EM-6-LOG: sl_ip_sla_report.tcl : DEBUG(smtp_lib) : smtp_write : Latest operation return code: OK
Feb 18 10:39:33.297 CST: %HA_EM-6-LOG: sl_ip_sla_report.tcl : DEBUG(smtp_lib) : smtp_write : Number of successes: 21
Feb 18 10:39:33.297 CST: %HA_EM-6-LOG: sl_ip_sla_report.tcl : DEBUG(smtp_lib) : smtp_write : Number of failures: 13
Feb 18 10:39:33.297 CST: %HA_EM-6-LOG: sl_ip_sla_report.tcl : DEBUG(smtp_lib) : smtp_write : Operation time to live: Forever
Feb 18 10:39:33.297 CST: %HA_EM-6-LOG: sl_ip_sla_report.tcl : DEBUG(smtp_lib) : smtp_write : {router}
Feb 18 10:39:33.297 CST: %HA_EM-6-LOG: sl_ip_sla_report.tcl : DEBUG(smtp_lib) : smtp_write :
Feb 18 10:39:33.297 CST: [fh_smtp_debug_cmd]
Feb 18 10:39:33.297 CST: %HA_EM-6-LOG: sl_ip_sla_report.tcl : DEBUG(smtp_lib) : smtp_write : .
Feb 18 10:39:33.529 CST: IP: s=10.0.10.10 (GigabitEthernet0/1), d=10.5.34.242, len 40, input feature
Feb 18 10:39:33.529 CST: TCP src=25, dst=26095, seq=2112864439, ack=3453704858, win=65374 ACK, Stateful Inspection(4), rtype 0, forus FALSE, sendself FALSE, mtu 0, fwdchk FALSE
Feb 18 10:39:33.529 CST: IP: s=10.0.10.10 (GigabitEthernet0/1), d=10.5.34.242, len 40, input feature
Feb 18 10:39:33.529 CST: TCP src=25, dst=26095, seq=2112864439, ack=3453704858, win=65374 ACK, Virtual Fragment Reassembly(21), rtype 0, forus FALSE, sendself FALSE, mtu 0, fwdchk FALSE
Feb 18 10:39:33.529 CST: IP: s=10.0.10.10 (GigabitEthernet0/1), d=10.5.34.242, len 40, input feature
Feb 18 10:39:33.529 CST: TCP src=25, dst=26095, seq=2112864439, ack=3453704858, win=65374 ACK, Virtual Fragment Reassembly After IPSec Decryption(32), rtype 0, forus FALSE, sendself FALSE, mtu 0, fwdchk FALSE
Feb 18 10:39:33.529 CST: IP: s=10.0.10.10 (GigabitEthernet0/1), d=10.5.34.242, len 40, input feature
Feb 18 10:39:33.529 CST: TCP src=25, dst=26095, seq=2112864439, ack=3453704858, win=65374 ACK, NAT Outside(53), rtype 0, forus FALSE, sendself FALSE, mtu 0, fwdchk FALSE
Feb 18 10:39:33.529 CST: IP: s=10.0.10.10 (GigabitEthernet0/1), d=10.5.34.242, len 40, input feature
Feb 18 10:39:33.529 CST: TCP src=25, dst=26095, seq=2112864439, ack=3453704858, win=65374 ACK, MCI Check(64), rtype 0, forus FALSE, sendself FALSE, mtu 0, fwdchk FALSE
Feb 18 10:39:33.529 CST: FIBipv4-packet-proc: route packet from GigabitEthernet0/1 src 10.0.10.10 dst 10.5.34.242
Feb 18 10:39:33.529 CST: FIBfwd-proc: Default:10.5.34.242/32 receive entry
Error:
Feb 18 10:39:33.529 CST: FIBipv4-packet-proc: packet routing failed
Feb 18 10:39:33.529 CST: IP: tableid=0, s=10.0.10.10 (GigabitEthernet0/1), d=10.5.34.242 (GigabitEthernet0/0), routed via RIB
Feb 18 10:39:33.529 CST: IP: s=10.0.10.10 (GigabitEthernet0/1), d=10.5.34.242 (GigabitEthernet0/0), len 40, output feature
Feb 18 10:39:33.529 CST: TCP src=25, dst=26095, seq=2112864439, ack=3453704858, win=65374 ACK ACK
Feb 18 10:39:33.953 CST: IP: s=10.0.10.10 (GigabitEthernet0/1), d=10.5.34.242, len 40, stop process pak for forus packet
Feb 18 10:39:33.953 CST: TCP src=25, dst=26095, seq=2112864535, ack=3453704892, win=65340 ACK
Feb 18 10:39:34.245 CST: IP: s=10.0.10.10 (GigabitEthernet0/1), d=10.5.34.242, len 88, input feature
Feb 18 10:39:34.245 CST: TCP src=25, dst=26095, seq=2112864535, ack=3453704892, win=65340 ACK PSH, Stateful Inspection(4), rtype 0, forus FALSE, sendself FALSE, mtu 0, fwdchk FALSE
Feb 18 10:39:34.245 CST: IP: s=10.0.10.10 (GigabitEthernet0/1), d=10.5.34.242, len 88, input feature
Feb 18 10:39:34.245 CST: TCP src=25, dst=26095, seq=2112864535, ack=3453704892, win=65340 ACK PSH, Virtual Fragment Reassembly(21), rtype 0, forus FALSE, sendself FALSE, mtu 0, fwdchk FALSE
Feb 18 10:39:34.249 CST: IP: s=10.0.10.10 (GigabitEthernet0/1), d=10.5.34.242, len 88, input feature
Feb 18 10:39:34.249 CST: TCP src=25, dst=26095, seq=2112864535, ack=3453704892, win=65340 ACK PSH, Virtual Fragment Reassembly After IPSec Decryption(32), rtype 0, forus FALSE, sendself FALSE, mtu 0, fwdchk FALSE
Feb 18 10:39:34.249 CST: IP: s=10.0.10.10 (GigabitEthernet0/1), d=10.5.34.242, len 88, input feature
Feb 18 10:39:34.249 CST: TCP src=25, dst=26095, seq=2112864535, ack=3453704892, win=65340 ACK PSH, NAT Outside(53), rtype 0, forus FALSE, sendself FALSE, mtu 0, fwdchk FALSE
Feb 18 10:39:34.249 CST: IP: s=10.0.10.10 (GigabitEthernet0/1), d=10.5.34.242, len 88, input feature
Feb 18 10:39:34.249 CST: TCP src=25, dst=26095, seq=2112864535, ack=3453704892, win=65340 ACK PSH, MCI Check(64), rtype 0, forus FALSE, sendself FALSE, mtu 0, fwdchk FALSE
Feb 18 10:39:34.249 CST: FIBipv4-packet-proc: route packet from GigabitEthernet0/1 src 10.0.10.10 dst 10.5.34.242
Feb 18 10:39:34.249 CST: FIBfwd-proc: Default:10.5.34.242/32 receive entry
Error:
Feb 18 10:39:34.249 CST: FIBipv4-packet-proc: packet routing failed
Feb 18 10:39:34.249 CST: IP: tableid=0, s=10.0.10.10 (GigabitEthernet0/1), d=10.5.34.242 (GigabitEthernet0/0), routed via RIB
Feb 18 10:39:34.249 CST: IP: s=10.0.10.10 (GigabitEthernet0/1), d=10.5.34.242 (GigabitEthernet0/0), len 88, output feature
Feb 18 10:39:34.249 CST: TCP src=25, dst=26095, seq=2112864535, ack=3453704892, win=65340 ACK PSH, CCE Output Classification(5), rtype 1, forus FALSE, sendself FALSE, mtu 0, fwdchk FALSE
Feb 18 10:39:34.249 CST: IP: s=10.0.10.10 (GigabitEthernet0/1), d=10.5.34.242 (GigabitEthernet0/0), len 88, output feature
Feb 18 10:39:34.249 CST: TCP src=25, dst=26095, seq=2112864535, ack=3453704892, win=65340 ACK PSH, Post-routing NAT Outside(17), rtype 1, forus FALSE, sendself FALSE, mtu 0, fwdchk FALSE
Feb 18 10:39:34.249 CST: IP: s=10.0.10.10 (GigabitEthernet0/1), d=10.5.34.242 (GigabitEthernet0/0), len 88, output feature
Feb 18 10:39:34.249 CST: TCP src=25, dst=26095, seq=2112864535, ack=3453704892, win=65340 ACK PSH, Stateful Inspection(20), rtype 1, forus FALSE, sendself FALSE, mtu 0, fwdchk FALSE
Feb 18 10:39:34.249 CST: IP: s=10.0.10.10 (GigabitEthernet0/1), d=10.5.34.242, len 88, rcvd 4
Feb 18 10:39:34.249 CST: TCP src=25, dst=26095, seq=2112864535, ack=3453704892, win=65340 ACK PSH
Feb 18 10:39:34.249 CST: IP: s=10.0.10.10 (GigabitEthernet0/1), d=10.5.34.242, len 88, stop process pak for forus packet
Feb 18 10:39:34.249 CST: TCP src=25, dst=26095, seq=2112864535, ack=3453704892, win=65340 ACK PSH
Feb 18 10:39:34.249 CST: IP: s=10.0.10.10 (GigabitEthernet0/1), d=10.5.34.242, len 40, input feature
Feb 18 10:39:34.249 CST: TCP src=25, dst=26095, seq=2112864583, ack=3453704892, win=65340 ACK FIN, Stateful Inspection(4), rtype 0, forus FALSE, sendself FALSE, mtu 0, fwdchk FALSE
Feb 18 10:39:34.249 CST: IP: s=10.0.10.10 (GigabitEthernet0/1), d=10.5.34.242, len 40, input feature
Feb 18 10:39:34.249 CST: TCP src=25, dst=26095, seq=2112864583, ack=3453704892, win=65340 ACK FIN, Virtual Fragment Reassembly(21), rtype 0, forus FALSE, sendself FALSE, mtu 0, fwdchk FALSE
Feb 18 10:39:34.249 CST: IP: s=10.0.10.10 (GigabitEthernet0/1), d=10.5.34.242, len 40, input feature
Feb 18 10:39:34.249 CST: TCP src=25, dst=26095, seq=2112864583, ack=3453704892, win=65340 ACK FIN, Virtual Fragment Reassembly After IPSec Decryption(32), rtype 0, forus FALSE, sendself FALSE, mtu 0, fwdchk FALSE
Feb 18 10:39:34.249 CST: IP: s=10.0.10.10 (GigabitEthernet0/1), d=10.5.34.242, len 40, input feature
Feb 18 10:39:34.249 CST: TCP src=25, dst=26095, seq=2112864583, ack=3453704892, win=65340 ACK FIN, NAT Outside(53), rtype 0, forus FALSE, sendself FALSE, mtu 0, fwdchk FALSE
Feb 18 10:39:34.249 CST: IP: s=10.0.10.10 (GigabitEthernet0/1), d=10.5.34.242, len 40, input feature
Feb 18 10:39:34.249 CST: TCP src=25, dst=26095, seq=2112864583, ack=3453704892, win=65340 ACK FIN, MCI Check(64), rtype 0, forus FALSE, sendself FALSE, mtu 0, fwdchk FALSE
Feb 18 10:39:34.249 CST: FIBipv4-packet-proc: route packet from GigabitEthernet0/1 src 10.0.10.10 dst 10.5.34.242
Feb 18 10:39:34.249 CST: FIBfwd-proc: Default:10.5.34.242/32 receive entry
Error:
Feb 18 10:39:34.249 CST: FIBipv4-packet-proc: packet routing failed
Feb 18 10:39:34.249 CST: IP: tableid=0, s=10.0.10.10 (GigabitEthernet0/1), d=10.5.34.242 (GigabitEthernet0/0), routed via RIB
Feb 18 10:39:34.249 CST: IP: s=10.0.10.10 (GigabitEthernet0/1), d=10.5.34.242 (GigabitEthernet0/0), len 40, output feature
Feb 18 10:39:34.249 CST: TCP src=25, dst=26095, seq=2112864583, ack=3453704892, win=65340 ACK FIN, CCE Output Classification(5), rtype 1, forus FALSE, sendself FALSE, mtu 0, fwdchk FALSE
Feb 18 10:39:34.249 CST: IP: s=10.0.10.10 (GigabitEthernet0/1), d=10.5.34.242 (GigabitEthernet0/0), len 40, output feature
Feb 18 10:39:34.249 CST: TCP src=25, dst=26095, seq=2112864583, ack=3453704892, win=65340 ACK FIN, Post-routing NAT Outside(17), rtype 1, forus FALSE, sendself FALSE, mtu 0, fwdchk FALSE
Feb 18 10:39:34.249 CST: IP: s=10.0.10.10 (GigabitEthernet0/1), d=10.5.34.242 (GigabitEthernet0/0), len 40, output feature
Feb 18 10:39:34.249 CST: TCP src=25, dst=26095, seq=2112864583, ack=3453704892, win=65340 ACK FIN, Stateful Inspection(20), rtype 1, forus FALSE, sendself FALSE, mtu 0, fwdchk FALSE
Feb 18 10:39:34.249 CST: IP: s=10.0.10.10 (GigabitEthernet0/1), d=10.5.34.242, len 40, rcvd 4
Feb 18 10:39:34.249 CST: TCP src=25, dst=26095, seq=2112864583, ack=3453704892, win=65340 ACK FIN
Feb 18 10:39:34.249 CST: IP: s=10.0.10.10 (GigabitEthernet0/1), d=10.5.34.242, len 40, stop process pak for forus packet
Feb 18 10:39:34.249 CST: TCP src=25, dst=26095, seq=2112864583, ack=3453704892, win=65340 ACK FIN
Feb 18 10:39:34.249 CST: IP: s=10.5.34.242 (local), d=10.0.10.10, len 40, local feature
Feb 18 10:39:34.249 CST: TCP src=26095, dst=25, seq=3453704892, ack=2112864584, win=3746 ACK, NAT(2), rtype 0, forus FALSE, sendself FALSE, mtu 0, fwdchk FALSE
Feb 18 10:39:34.249 CST: FIBipv4-packet-proc: route packet from (local) src 10.5.34.242 dst 10.0.10.10
Feb 18 10:39:34.249 CST: FIBfwd-proc: Default:10.0.0.0/20 proces level forwarding
Feb 18 10:39:34.249 CST: FIBfwd-proc: depth 0 first_idx 0 paths 2 long 0(0)
Feb 18 10:39:34.249 CST: FIBfwd-proc: try path 0 (of 2) v4-anh-10.5.34.241-Gi0/0 first short ext 0(-1)
Feb 18 10:39:34.249 CST: FIBfwd-proc: v4-anh-10.5.34.241-Gi0/0 valid
Feb 18 10:39:34.249 CST: FIBfwd-proc: ip_pak_table 0 ip_nh_table 65535 if GigabitEthernet0/0 nh 10.5.34.241 deag 0 via fib 0 path type attached nexthop
Feb 18 10:39:34.249 CST: FIBfwd-proc: packet routed to GigabitEthernet0/0 10.5.34.241(0)
Feb 18 10:39:34.249 CST: FIBipv4-packet-proc: packet routing succeeded ACK
Feb 18 10:39:34.405 CST: IP: s=10.0.10.10 (GigabitEthernet0/1), d=10.5.34.242, len 40, stop process pak for forus packet
Feb 18 10:39:34.405 CST: TCP src=25, dst=26095, seq=2112864584, ack=3453704893, win=65340 ACK
Regards,
-Doug
P.S. Can you recommend any documents or books to learn Tcl? -
No key field found for creation of DataSource - Classification Datasource
Hello,
While trying to create a classification datasource based on 0PLANT_ATTR, when I assign a characteristic and push the DataSource button, I get the following message:
No key field found for creation of DataSource
Diagnosis
During generation of a classification or configuration DataSource, only those key fields of the object table (field "Obj.Tabelle") that are already present in the basis DataSource are transferred to the extract structure. The error occurs when none of the key fields of the object table were found in the structure of the basis DataSource.
System response
A DataSource cannot be created without key fields. The action was cancelled.
Procedure
Check whether you have selected the correct basis DataSource and object table. For more information, please see SAP Note 569849.
Do you know what can be the problem?
Thank you and regards
Hi Alberto,
Plants are a special case. The key used for the classification of plants (object type BETR) is not the same as the key used in datasource 0PLANT_ATTR.
BETR has key LOCNR (Site); that is a customer related to a plant. The customer number will be extracted in field LOCNR. 0PLANT_ATTR extracts the plant in its key field WERKS.
Transaction CTBW and the generic extraction program for classifications don't know the relationship between LOCNR and WERKS, so they cannot map them.
I recommend a solution that adds the mapping between LOCNR and WERKS:
1. Create the classification datasource as intended, but use datasource 0RT_LOC_MGR_ATTR as the basis datasource. It is the only datasource in the content where LOCNR is a key field, so 0RT_LOC_MGR_ATTR serves as a dummy here, allowing transaction CTBW to create the classification datasource. It is not necessary to extract data with datasource 0RT_LOC_MGR_ATTR.
2. Extend the extract structure of the created classification datasource: add field WERKS using component type WERKS_D, and make this field visible.
3. Fill field WERKS in the extractor user exit EXIT_SAPLRSAP_002. WERKS can be read from table KNA1 by using the customer number extracted to LOCNR to select on field KNA1-KUNNR.
4. Transaction CTBW_META on the BW system isn't able to append the characteristics from the classification datasource to infosource 0PLANT, because the keys are different. So create a new infosource with CTBW_META; this allows CTBW_META to create the info objects for the characteristics used in the classification datasource.
5. Add the characteristics used in the classification datasource to infosource 0PLANT manually. You will find the info object names of the characteristics by looking up the characteristic datasources assigned to the classification datasource in transaction CTBW. From these names you can derive the info object names: 1CL_A... -> C_A...
6. Disconnect the infosource that was created with CTBW_META from the classification datasource.
7. Connect the classification datasource to infosource 0PLANT, mapping info object 0PLANT to field WERKS. The info object names for the characteristics are explained in step 5.
8. Add an infopackage to infosource 0PLANT for the classification datasource.
Now the extraction of classifications of sites should work.
Best regards,
Rolf
P.S. I saw the system messed up the formatting and doesn't display any new or empty lines. Sorry, I hope you can still read it.
Edited by: Rolf Doersam on Mar 26, 2010 6:56 PM -
Copy of Material Master charac. values to the batch classif. in GR for PO
Dear gurus,
Could you please help me with the following issue. I have a material managed in batches, and it has classification type 023 in the material master. I fill one characteristic of this classification with some value. Now I want this value to be copied to the batch classification when a new batch is created while making a GR for a production order.
Is it possible?
Hi Nikolaj,
What I understand of your requirement is this:
you want to fetch the value of a characteristic maintained in the batch class of the material master into the batches, correct?
But my friend, if you maintain a characteristic value in the batch class in the material master, it works as a validation.
For Example,
Suppose your characteristic is Colour, and in the material master batch classification view you have maintained the value Red. Then the system will not allow any other colour in the batches; you will find that value in the drop-down list.
Regards,
Dhaval -
Creation of Classification View in Material Master
Hi,
I need to create Classification View for a Material. The user would provide the data in the file, example
Material Number MATNR
Class type TCLA-KLART
Class Type description TCLAT-ARTXT
Class Group TCLG-KLAGR
Class Group Description TCLGT-KTEXT
Class M_CLASB-CLASS
Class Description M_CLASB-CSCHG
Status RMCLF-STATU
Std class RMCLF-STDCL
Characteristics Group TCMG-ATKLA
Characteristics Group Description TCMGT-ATKLT
Characteristics RCTAV-ATNAM
Characteristics Description CABNT-ATBEZ
I read in the posts there is BAPI_OBJCL_CREATE. But I am not able to Map the above fields to pass to this BAPI.
Can someone please suggest me how to do this..
Thanks and Regards,
Vanessa
Hi,
Thanks for the suggestion, but I want to do this through a program. There is a selection screen with 6 radio buttons based on the view to be created, and an input field for the filename.
Could you suggest a BAPI to do the same?
Thanks and Regards,
Vanessa -
Batch management with classification
Hi,
I have material with batch and classification. Material master have base unit KG, but I need second unit PC or M and other parameters of material. (coefficient between KG and PC is not constant).
I give to classification attributes fields: quantity, length. width, ...
In goods receipt I input batch with classification (quantity, length. width, ...)
Where can I see batch balance with classification ?
Is possible recalculate attribute of classification quantity after goods issue or how I can manage balance of quantity?
thank
beto
Hi,
Once you fix the classification, I don't think you can recalculate it. What do you mean by batch balance?
Aktar -
Classification of IFLOT & EQUI via CLB2 / CLB1
Hi all,
I've used CLB2 to mass populate and update classifications assigned to functional locations and equipment. This has worked well to date.
However, I am now attempting to improve processing performance. While reviewing the available functionality, I noted that CLB1 generates batch input sessions from the logical filename provided and supports essentially the same input file.
Experimentation shows that this transaction is no longer suitable for general use, since it applies the batch input sessions against transactions CL20 and CL22, which are no longer used.
Assuming I have 800,000 updates to apply, is there a standard mechanism other than CLB2 where I can configure additional logical filenames to stream the updates in the background?
regards
Daniel
Hi Daniel,
Assuming you are trying to update classification data for IS-U type functional locations and devices, you can update it using ISMW migration object OBJCLASS. I think this object would work for non-IS-U functional locations and equipment too.
The fields below can be defaulted depending on whether you are updating classification for functional locations or for devices.
HEADER-OBJECT_TYPE = "IFLOT" or "EQUI".
DATA-FIELD = TPLNR or EQUNR
The other structures and fields are self-explanatory, but feel free to ask follow-up questions.
Alternately, LSMW also provides a standard BI object (0130, classification) that may be used as well, although I have only used the ISMW OBJCLASS object, not the LSMW one.
Hope this helps.
Ash -
Automatic batch classification with procedure on goods receipt does not work
Hi,
I have a material with batch classification.
The material has two characteristics.
One is a number called VALUE.
The other is also a number (called CONVERSION) and shall be derived from the first.
A procedure is created and assigned to the second characteristic.
$SET_DEFAULT ($SELF, CONVERSION , $SELF.VALUE / 6000)
When I post a goods receipt (movement type 501), the characteristics can be filled. The field VALUE is set to 12000.
I was expecting the characteristic CONVERSION to be calculated as '2' and filled by the procedure.
This is not the case.
If, however, I try to fill it manually, the system accepts only the calculated value '2'; otherwise an error is shown.
How can I force the system to calculate and fill the second characteristic when entering a value in the first?
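One thing worth checking (an assumption on my part, not confirmed in this thread): $SET_DEFAULT only proposes a default value and is not necessarily re-evaluated once the characteristic screen has been processed. A procedure that assigns the value directly would read, in dependency syntax:

```
$self.CONVERSION = $self.VALUE / 6000
```

Whether such a procedure fires during goods receipt still depends on when the classification screen evaluates dependencies, so this is worth testing in your system.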
Thanks for any help.
Hi Marcus,
I have the same problem with movement type 101.
How did you solve this issue?
Can you help me?
Thank you.
CM -
Hi all,
I have implemented a basic Service Desk.
Now I want to implement an SLA for it.
Currently I have configured the mailing action depending on a status change, but now the requirement is to trigger the mails depending on a timeline, which can be done via an SLA.
I have gone through the Service Desk additional guide available on service.sap, but it was not that helpful.
Can anyone please explain how to implement this?
Points will be rewarded for sure.
Dear Prakhar,
If you want to work with SLAs, a service product and contracts have to be used. The relevant dates are then not on the header level of a service desk message but on the item level. The actions concerning the SLA have to be defined on item level.
The service desk messages are processed with transaction CRM_DNO_MONITOR. In this monitor, fields for both the header and the item of a message are available and can be displayed.
CRM transaction type SLFN is NOT configured for use with contracts, SLAs, etc. If you want to use these functionalities, CRM transaction type SLF1 or a custom transaction type should be used.
Availability and Response Times can be maintained with transaction
CRMD_SERV_SLA.
Detailed information about the use and customizing of Service
Contracts and Service Level Agreements can be found
- in the online documentation on Service Level Agreements
- in the CRM IMG documentation, e.g.:
  - Customer Relationship Management
  - Transactions
  - Settings for Service Processes
  - SLA Escalation Management
- in the additional information on the Service Desk, found in the
  SAP Service Marketplace under the quicklink 'Solutionmanager':
  - Media Library
  - Technical Papers
  - Service Desk: Additional Information
Hope this helps explain things.
Regards
Amit -
Service Desk with SLA: item data not automatic
Hello all.
I'm trying to set up Service Desk with contract determination for monitoring SLAs (SolMan 7.0 SP15).
Everything works well except for item/product determination. When I check a support message created in a satellite system, the organizational data are OK, but the item data are blank. I put my own product in the item manually, enter a quantity, press Enter and... that's all: the contract data are correctly filled, the SLA schema is OK, no errors at all.
I'm sure I missed some basic configuration, but I don't know where... Maybe in the action profile of the ABA message? Or in the item category determination?
Help me!
Hi Michele,
Go to SPPFCADM > select application DNO_NOTIF > Define Action Profile and Actions > action profile SLFN0001_STANDARD_DNO > action SLFN0001_STANDARD_DNO_CRM > select the processing type Method Call and choose the 'change' icon in the method call settings.
In the container editor, choose Create and maintain the element: for Name & Description enter ITEM_PRODUCT_ID; choose ABAP Dict. Reference and maintain Structure CRMD_ORDERADM_I, Field ORDERED_PROD; on the initial value tab, enter the product code (e.g. SUPPORT),
then press Enter and save.
Thanks
Ram -
Batch characteristic in classification not updating
Hello expert,
I have created a classification for batches (class type 023) with characteristic MCH1-LWEDT to record the last GR date. The batch number has to be entered manually during GR posting, and I expected the characteristic value for the GR date to be updated automatically in the classification during GR posting, but I've found that it is blank.
My question is: can the GR date in the characteristic value of the batch class be updated automatically during GR posting?
If yes, what's wrong with my class, and how do I make it work as expected?
Really appreciate your help.
Thank you in advance
Hi Rahul,
Just to make sure: have you created a characteristic merely named MCH1-LWEDT, or have you actually linked your characteristic to the field MCH1-LWEDT in the "Addtnl Data" tab in CT04?
BR
Raf
Edited by: Rafael Zaragatzky on Aug 5, 2011 1:03 PM -
Values not getting updated in Classification tab by BAPI_BATCH_SAVE_REPLICA
Hello Experts ,
I am currently facing a problem with value updates in the Classification tab for the batch master. To create the classification I have marked BATCHCONTROLFIELDS-DOCLASSIFY = 'X', and to pass values to the classification I have done the coding below.
wa_classvaluationschar-value_char = wa_itab-value1.
APPEND wa_classvaluationschar TO it_classvaluationschar.
wa_classvaluationsnum-value_from = wa_itab-value1.
wa_classvaluationsnum-unit_from = wa_itab-meins.
APPEND wa_classvaluationsnum TO it_classvaluationsnum.
Finally I called the FM 'BAPI_BATCH_SAVE_REPLICA'. But in spite of this, I am not able to see the values in the created batch. Please advise.
Thanks
-Trishna
Hi Trishna,
You need to fill the fields below for the update to work:
OBJECTKEY = concatenation of internal material number & plant & batch number
OBJECTTABLE = MCHA
in both structures, CLASSALLOCATIONS and CLASSVALUATIONSCHAR. Then it will update.
Cheers,
Luri
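Building on Luri's answer, the valuation lines from the original snippet would be filled roughly like this. This is only a sketch: the OBJECTKEY/OBJECTTABLE field names are taken from the answer above, the key offsets assume the usual field lengths (MATNR 18, WERKS 4, CHARG 10 characters, material in internal format), and the characteristic name is a placeholder:

```abap
DATA lv_objkey(32) TYPE c.

lv_objkey(18)    = wa_itab-matnr.   "internal material number
lv_objkey+18(4)  = wa_itab-werks.   "plant
lv_objkey+22(10) = wa_itab-charg.   "batch number

wa_classallocations-objectkey   = lv_objkey.
wa_classallocations-objecttable = 'MCHA'.
APPEND wa_classallocations TO it_classallocations.

wa_classvaluationschar-objectkey   = lv_objkey.
wa_classvaluationschar-objecttable = 'MCHA'.
wa_classvaluationschar-charact     = 'ZVALUE1'.      "characteristic name (placeholder)
wa_classvaluationschar-value_char  = wa_itab-value1.
APPEND wa_classvaluationschar TO it_classvaluationschar.
```

Note that the original snippet also never filled the characteristic name, which by itself would prevent the values from being assigned.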