Duplicate Records in Output
I have a script which works fine; however, what I can't seem to work out is why I get duplicate records in the output. If anyone knows why, it would be a great help.
My code is below:
Connect-QADService "domain.com"
Function CheckUserExistance {
    if (Get-QADUser -Identity $Global:Sam) {
        Write-Host "User Found: $Global:Sam"
        $Example = $Global:Sam + "2"
        $prompt = "This User Already Exists, Please Choose Another Name. (e.g. $Example)"
        $Title = "Error"
        Add-Type -AssemblyName Microsoft.VisualBasic
        $popup = [Microsoft.VisualBasic.Interaction]::MsgBox($prompt, "OkOnly,Critical", $Title)
        if ($popup -eq "Ok") {
            $prompt = "Please Enter The Name To Create (All Lowercase e.g john.smith):"
            $Title = "User To Create"
            Add-Type -AssemblyName Microsoft.VisualBasic
            $Global:Sam = [Microsoft.VisualBasic.Interaction]::InputBox($prompt, $Title)
            if ($Global:Sam -eq "") { exit }
        }
    }
}
# The following Function is used to randomly generate complex passwords.
function New-Password {
    param (
        [int]$length,
        [switch]$lowerCase,
        [switch]$upperCase,
        [switch]$numbers,
        [switch]$specialChars
    )
    BEGIN {
        # Usage Instructions
        function Usage() {
            Write-Host ''
            Write-Host 'FUNCTION NAME: New-Password' -ForegroundColor White
            Write-Host ''
            Write-Host 'USAGE'
            Write-Host '    New-Password -length 10 -upperCase -lowerCase -numbers'
            Write-Host '    New-Password -length 10 -specialChars'
            Write-Host '    New-Password -le 10 -lo -u -n -s'
            Write-Host '    New-Password'
            Write-Host ''
            Write-Host 'DESCRIPTION:'
            Write-Host '    Generates a random password of a given length (-length parameter)'
            Write-Host '    comprised of at least one character from each subset provided'
            Write-Host '    as a switch parameter.'
            Write-Host ''
            Write-Host 'AVAILABLE SWITCHES:'
            Write-Host '    -lowerCase : include all lower case letters'
            Write-Host '    -upperCase : include all upper case letters'
            Write-Host '    -numbers : include 0-9'
            Write-Host '    -specialChars : include the following- !@#$%^&*()_+-={}[]<>'
            Write-Host ''
            Write-Host 'REQUIREMENTS:'
            Write-Host '    You must provide the -length (four or greater) and at least one character switch'
            Write-Host ''
        }
        function generate_password {
            if ($lowerCase) {
                $charsToUse += $lCase
                $regexExp += "(?=.*[$lCase])"
            }
            if ($upperCase) {
                $charsToUse += $uCase
                $regexExp += "(?=.*[$uCase])"
            }
            if ($numbers) {
                $charsToUse += $nums
                $regexExp += "(?=.*[$nums])"
            }
            if ($specialChars) {
                $charsToUse += $specChars
                $regexExp += "(?=.*[\W])"
            }
            $test = [regex]$regexExp
            $rnd = New-Object System.Random
            do {
                $pw = $null
                for ($i = 0; $i -lt $length; $i++) {
                    $pw += $charsToUse[($rnd.Next(0, $charsToUse.Length))]
                    Start-Sleep -Milliseconds 20
                }
            } until ($pw -match $test)
            return $pw
        }
        # Displays help
        if (($Args[0] -eq "-?") -or ($Args[0] -eq "-help")) {
            Usage
            break
        }
        else {
            $lCase = 'abcdefghijklmnopqrstuvwxyz'
            $uCase = $lCase.ToUpper()
            $nums = '1234567890'
            $specChars = '!@#$%^&*()_+-={}[]<>'
        }
    }
    PROCESS {
        if (($length -ge 4) -and ($lowerCase -or $upperCase -or $numbers -or $specialChars)) {
            $newPassword = generate_password
        }
        else {
            Usage
            break
        }
        $newPassword
    }
    END {}
}
Import-Csv "C:\ExternalADAccounts.csv" |
ForEach-Object {
    $First = $_.Forename
    $Last = $_.Surname
    $Password = New-Password -length 8 -upperCase -lowerCase -numbers -specialChars
    $stuff = $_.stuff
    $OU = "domain.com/ou"
    $UPN = "$First.$Last@$stuff.domain.com"
    $Global:Sam = "$First.$Last"
    $Display = "$First $Last"
    $Global:Sam = $Global:Sam -Replace " ", ""
    $Global:Sam = $Global:Sam -Replace "'", ""
    $Count = $Global:Sam.Length
    If ($Count -gt 19) {
        $Global:Sam = $First[0] + "." + $Last
    }
    Do { CheckUserExistance }
    While (Get-QADUser -Identity $Global:Sam)
    Write-Host "Creating User: $Global:Sam"
    New-QADUser -FirstName $First `
        -LastName $Last `
        -DisplayName $Display `
        -Name $Global:Sam `
        -SamAccountName $Global:Sam.ToLower() `
        -UserPassword $Password `
        -UserPrincipalName $UPN.ToLower() `
        -ParentContainer $OU
    do {
        $Test = Get-QADUser -Identity $Global:Sam
    } until (Get-QADUser -Identity $Global:Sam)
    Set-QADUser $Global:Sam -Title 'Non-Managed'
} | Select-Object @{label="DisplayName";expression={$Display}}, @{label="SamAccountName";expression={$Global:Sam}}, @{label="Password";expression={$Password}}, @{label="Stuff";expression={$Stuff}} |
Sort-Object $Stuff |
Export-Csv "C:\NonManagedCreated.csv" -NoTypeInformation
Any help is greatly appreciated.
James
Hi James,
Please try the script below, which uses a PSObject to build each output record:
$output = @()
Import-Csv "C:\ExternalADAccounts.csv" |
ForEach-Object {
    # ... your user-creation logic goes here ...
    do {
        $Test = Get-QADUser -Identity $Global:Sam
    } until (Get-QADUser -Identity $Global:Sam)
    Set-QADUser $Global:Sam -Title 'Non-Managed'
    # Build the output record
    $Object = New-Object -TypeName PSObject
    $Object | Add-Member -Name 'DisplayName' -MemberType NoteProperty -Value $Display
    $Object | Add-Member -Name 'SamAccountName' -MemberType NoteProperty -Value $Global:Sam
    $Object | Add-Member -Name 'Password' -MemberType NoteProperty -Value $Password
    $Object | Add-Member -Name 'Stuff' -MemberType NoteProperty -Value $Stuff
    $output += $Object
}
$output | Select-Object DisplayName, SamAccountName, Password, Stuff | Sort-Object Stuff | Export-Csv "C:\NonManagedCreated.csv" -NoTypeInformation
If there's anything else regarding this issue, please feel free to post back.
Best Regards,
Anna Wang
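For what it's worth, the duplication in the original script most likely comes from New-QADUser and Set-QADUser writing their output objects into the pipeline: every extra object emitted inside ForEach-Object becomes another input to Select-Object, and since the calculated properties all read the same variables, each extra object renders as an identical CSV row. Piping those calls to Out-Null (or assigning them to a variable), or collecting explicit objects as in the reply above, avoids it. A Python sketch of the same leak (function and field names invented for the analogy):

```python
def create_user(name):
    # Stand-in for a cmdlet such as New-QADUser or Set-QADUser that
    # returns the object it created or modified.
    return {"created": name}

def leaky_pipeline(names):
    for n in names:
        yield create_user(n)   # the cmdlet's return value leaks into the pipeline
        yield {"record": n}    # the record we actually meant to emit

def fixed_pipeline(names):
    for n in names:
        _ = create_user(n)     # capture (or discard) the side-effect output
        yield {"record": n}    # exactly one record per input row

assert len(list(leaky_pipeline(["john.smith"]))) == 2  # duplicate per input
assert len(list(fixed_pipeline(["john.smith"]))) == 1
```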
Similar Messages
-
How to remove duplicate records from output?
How do I remove duplicate records from the output? I used DELETE ADJACENT DUPLICATES, but the duplicate records keep coming back. Please suggest.
Hi Shruthi,
Thanks for your answer, but the duplicate records are still appearing.
Here is my code; please check it:
*& Report ZCRM_TROUBLE_TICKET
REPORT zcrm_trouble_ticket.
TYPES : BEGIN OF ty_qmih,
qmnum TYPE qmnum,
equnr TYPE equnr,
iloan TYPE iloan,
ausvn TYPE ausvn,
ausbs TYPE ausbs,
auztv TYPE auztv,
auztb TYPE auztb,
iwerk TYPE iwerk,
END OF ty_qmih,
BEGIN OF ty_qmel,
qmnum TYPE qmnum,
qmtxt TYPE qmtxt,
indtx TYPE indltx,
priok TYPE priok,
strmn TYPE strmn,
strur TYPE strur,
ltrmn TYPE ltrmn,
ltrur TYPE ltrur,
objnr TYPE qmobjnr,
arbpl TYPE lgwid,
vkorg TYPE vkorg,
vtweg TYPE vtweg,
spart TYPE spart,
END OF ty_qmel,
BEGIN OF ty_ihpa,
parnr TYPE i_parnr,
parvw TYPE parvw,
objnr TYPE qmobjnr,
END OF ty_ihpa,
BEGIN OF ty_crhd,
arbpl TYPE arbpl,
objid TYPE cr_objid,
END OF ty_crhd,
BEGIN OF ty_crtx,
ktext TYPE cr_ktext,
objid TYPE cr_objid,
END OF ty_crtx,
BEGIN OF ty_qmfe,
fecod TYPE fecod,
fegrp TYPE fegrp,
qmnum TYPE qmnum,
END OF ty_qmfe,
BEGIN OF ty_qmur,
urcod TYPE urcod,
urgrp TYPE urgrp,
urtxt TYPE urstx,
qmnum TYPE qmnum,
END OF ty_qmur,
BEGIN OF ty_iloa,
tplnr TYPE tplnr,
iloan TYPE iloan,
END OF ty_iloa,
BEGIN OF ty_output,
qmnum TYPE qmnum,
equnr TYPE equnr,
iloan TYPE iloan,
ausvn TYPE ausvn,
ausbs TYPE ausbs,
auztv TYPE auztv,
auztb TYPE auztb,
iwerk TYPE iwerk,
qmtxt TYPE qmtxt,
indtx TYPE indltx,
priok TYPE priok,
strmn TYPE strmn,
strur TYPE strur,
ltrmn TYPE ltrmn,
ltrur TYPE ltrur,
objnr TYPE qmobjnr,
arbpl TYPE lgwid,
vkorg TYPE vkorg,
vtweg TYPE vtweg,
spart TYPE spart,
parnr TYPE i_parnr,
parvw TYPE parvw,
arbpl TYPE arbpl,
objid TYPE cr_objid,
arbpl1 TYPE arbpl,
ktext TYPE cr_ktext,
fecod TYPE fecod,
fegrp TYPE fegrp,
urcod TYPE urcod,
urgrp TYPE urgrp,
urtxt TYPE urstx,
tplnr TYPE tplnr,
END OF ty_output.
DATA : it_qmih TYPE STANDARD TABLE OF ty_qmih,
it_qmel TYPE STANDARD TABLE OF ty_qmel,
it_ihpa TYPE STANDARD TABLE OF ty_ihpa,
it_crhd TYPE STANDARD TABLE OF ty_crhd,
it_crtx TYPE STANDARD TABLE OF ty_crtx,
it_qmfe TYPE STANDARD TABLE OF ty_qmfe,
it_qmur TYPE STANDARD TABLE OF ty_qmur,
it_iloa TYPE STANDARD TABLE OF ty_iloa,
it_output TYPE STANDARD TABLE OF ty_output,
wa_qmih TYPE ty_qmih,
wa_qmel TYPE ty_qmel,
wa_ihpa TYPE ty_ihpa,
wa_crhd TYPE ty_crhd,
wa_crtx TYPE ty_crtx,
wa_qmfe TYPE ty_qmfe,
wa_qmur TYPE ty_qmur,
wa_iloa TYPE ty_iloa,
wa_output TYPE ty_output.
INITIALIZATION.
REFRESH : it_qmih,
it_qmel,
it_ihpa,
it_crhd,
it_crtx,
it_qmfe,
it_qmur,
it_iloa,
it_output.
CLEAR: wa_qmih,
wa_qmel,
wa_ihpa,
wa_crhd,
wa_crtx,
wa_qmfe,
wa_qmur,
wa_iloa,
wa_output.
start-of-selection.
SELECT qmnum
equnr
iloan
ausvn
ausbs
auztv
auztb
iwerk
FROM qmih
INTO TABLE it_qmih.
SORT it_qmih BY qmnum .
DELETE ADJACENT DUPLICATES FROM it_qmih COMPARING qmnum equnr iloan ausvn ausbs auztv auztb iwerk.
SELECT qmnum
qmtxt
indtx
priok
strmn
strur
ltrmn
ltrur
objnr
arbpl
vkorg
vtweg
spart
FROM qmel
INTO TABLE it_qmel
FOR ALL ENTRIES IN it_qmih
WHERE qmnum = it_qmih-qmnum.
SORT it_qmel BY qmnum.
DELETE ADJACENT DUPLICATES FROM it_qmel COMPARING qmnum
qmtxt
indtx
strmn
strur
ltrmn
ltrur
objnr
arbpl
vkorg
vtweg
spart.
IF it_qmel IS NOT INITIAL.
SELECT parnr
parvw
objnr
FROM ihpa
INTO TABLE it_ihpa
FOR ALL ENTRIES IN it_qmel
WHERE objnr = it_qmel-objnr.
ENDIF.
DELETE ADJACENT DUPLICATES FROM it_ihpa COMPARING parnr
parvw
objnr.
IF it_qmel IS NOT INITIAL.
SELECT arbpl
objid
FROM crhd
INTO TABLE it_crhd
FOR ALL ENTRIES IN it_qmel
WHERE objid = it_qmel-arbpl.
ENDIF.
DELETE ADJACENT DUPLICATES FROM it_crhd COMPARING arbpl
objid.
IF it_qmel IS NOT INITIAL.
SELECT ktext
objid
FROM crtx
INTO TABLE it_crtx
FOR ALL ENTRIES IN it_crhd
WHERE objid = it_crhd-objid.
ENDIF.
DELETE ADJACENT DUPLICATES FROM it_crtx COMPARING ktext
objid.
IF it_qmih IS NOT INITIAL.
SELECT fecod
fegrp
qmnum
FROM qmfe
INTO TABLE it_qmfe
FOR ALL ENTRIES IN it_qmih
WHERE qmnum = it_qmih-qmnum.
ENDIF.
SORT it_qmfe BY qmnum.
DELETE ADJACENT DUPLICATES FROM it_qmfe COMPARING fecod
fegrp.
IF it_qmih IS NOT INITIAL.
SELECT urcod
urgrp
urtxt
qmnum
FROM qmur
INTO TABLE it_qmur
FOR ALL ENTRIES IN it_qmih
WHERE qmnum = it_qmih-qmnum.
ENDIF.
SORT it_qmur BY qmnum.
DELETE ADJACENT DUPLICATES FROM it_qmur COMPARING urcod
urgrp
urtxt.
IF it_qmih IS NOT INITIAL.
SELECT tplnr
iloan
FROM iloa
INTO TABLE it_iloa
FOR ALL ENTRIES IN it_qmih
WHERE iloan = it_qmih-iloan.
ENDIF.
DELETE ADJACENT DUPLICATES FROM it_iloa COMPARING tplnr
iloan.
LOOP AT it_qmih INTO wa_qmih.
wa_output-qmnum = wa_qmih-qmnum.
wa_output-equnr = wa_qmih-equnr.
wa_output-iloan = wa_qmih-iloan.
wa_output-ausvn = wa_qmih-ausvn.
wa_output-ausbs = wa_qmih-ausbs.
wa_output-auztv = wa_qmih-auztv.
wa_output-auztb = wa_qmih-auztb.
wa_output-iwerk = wa_qmih-iwerk.
APPEND wa_output TO it_output.
CLEAR wa_output.
READ TABLE it_qmel INTO wa_qmel WITH KEY qmnum = wa_qmih-qmnum.
wa_output-qmtxt = wa_qmel-qmtxt.
wa_output-indtx = wa_qmel-indtx.
wa_output-priok = wa_qmel-priok.
wa_output-strmn = wa_qmel-strmn.
wa_output-strur = wa_qmel-strur.
wa_output-ltrmn = wa_qmel-ltrmn.
wa_output-ltrur = wa_qmel-ltrur.
wa_output-objnr = wa_qmel-objnr.
wa_output-arbpl = wa_qmel-arbpl.
wa_output-vkorg = wa_qmel-vkorg.
wa_output-vtweg = wa_qmel-vtweg.
wa_output-spart = wa_qmel-spart.
APPEND wa_output TO it_output.
CLEAR wa_output.
READ TABLE it_ihpa INTO wa_ihpa WITH KEY objnr = wa_qmel-objnr.
wa_output-parnr = wa_ihpa-parnr.
wa_output-parvw = wa_ihpa-parvw.
APPEND wa_output TO it_output.
CLEAR wa_output.
READ TABLE it_crhd INTO wa_crhd WITH KEY objid = wa_qmel-arbpl.
wa_output-arbpl = wa_crhd-arbpl.
APPEND wa_output TO it_output.
CLEAR wa_output.
READ TABLE it_crtx INTO wa_crtx WITH KEY objid = wa_crhd-objid.
wa_output-ktext = wa_crtx-ktext.
APPEND wa_output TO it_output.
CLEAR wa_output.
READ TABLE it_qmfe INTO wa_qmfe WITH KEY qmnum = wa_qmih-qmnum.
wa_output-fecod = wa_qmfe-fecod.
wa_output-fegrp = wa_qmfe-fegrp.
APPEND wa_output TO it_output.
CLEAR wa_output.
READ TABLE it_qmur INTO wa_qmur WITH KEY qmnum = wa_qmih-qmnum.
wa_output-urcod = wa_qmur-urcod.
wa_output-urgrp = wa_qmur-urgrp.
wa_output-urtxt = wa_qmur-urtxt.
APPEND wa_output TO it_output.
CLEAR wa_output.
READ TABLE it_iloa INTO wa_iloa WITH KEY iloan = wa_qmih-iloan.
wa_output-tplnr = wa_iloa-tplnr.
APPEND wa_output TO it_output.
CLEAR wa_output.
ENDLOOP.
DELETE ADJACENT DUPLICATES FROM it_output COMPARING qmnum
equnr
ausvn
ausbs
auztv
auztb
iwerk
qmtxt
indtx
priok
strmn
strur
ltrmn
ltrur
vkorg
vtweg
spart
parnr
parvw
arbpl
ktext
fecod
fegrp
urcod
urgrp
urtxt
tplnr.
*CALL FUNCTION 'STATUS_TEXT_EDIT'
*  EXPORTING
*    CLIENT            = SY-MANDT
*    FLG_USER_STAT     = ' '
*    objnr             =
*    ONLY_ACTIVE       = 'X'
*    spras             = en
*    BYPASS_BUFFER     = ' '
*  IMPORTING
*    ANW_STAT_EXISTING =
*    E_STSMA           =
*    LINE              =
*    USER_LINE         =
*    STONR             =
*  EXCEPTIONS
*    OBJECT_NOT_FOUND  = 1
*    OTHERS            = 2.
*IF sy-subrc <> 0.
*  MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
*          WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
*ENDIF.
*CALL FUNCTION 'READ_TEXT'
*  EXPORTING
*    CLIENT                  = SY-MANDT
*    id                      =
*    language                =
*    name                    =
*    object                  =
*    ARCHIVE_HANDLE          = 0
*    LOCAL_CAT               = ' '
*  IMPORTING
*    HEADER                  =
*  tables
*    lines                   =
*  EXCEPTIONS
*    ID                      = 1
*    LANGUAGE                = 2
*    NAME                    = 3
*    NOT_FOUND               = 4
*    OBJECT                  = 5
*    REFERENCE_CHECK         = 6
*    WRONG_ACCESS_TO_ARCHIVE = 7
*    OTHERS                  = 8.
*IF sy-subrc <> 0.
*  MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
*          WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
*ENDIF.
*LOOP AT IT_OUTPUT INTO WA_OUTPUT.
*  WRITE : / WA_OUTPUT-qmnum,
*            WA_OUTPUT-equnr,
*            WA_OUTPUT-iloan,
*            WA_OUTPUT-ausvn,
*            WA_OUTPUT-ausbs,
*            WA_OUTPUT-auztv,
*            WA_OUTPUT-auztb,
*            WA_OUTPUT-qmtxt,
*            WA_OUTPUT-indtx,
*            WA_OUTPUT-strmn,
*            WA_OUTPUT-strur,
*            WA_OUTPUT-ltrmn,
*            WA_OUTPUT-ltrur,
*            WA_OUTPUT-objnr,
*            WA_OUTPUT-arbpl,
*            WA_OUTPUT-parnr,
*            WA_OUTPUT-parvw,
*            WA_OUTPUT-objid,
*            WA_OUTPUT-ktext,
*            WA_OUTPUT-fecod,
*            WA_OUTPUT-fegrp,
*            WA_OUTPUT-urcod,
*            WA_OUTPUT-urgrp,
*            WA_OUTPUT-urtxt,
*            WA_OUTPUT-tplnr.
*ENDLOOP.
CALL FUNCTION 'GUI_DOWNLOAD'
  EXPORTING
*   BIN_FILESIZE              =
    filename                  = 'E:\CRM1.TXT'
    FILETYPE                  = 'ASC'
*   APPEND                    = ' '
    write_field_separator     = '|'
*   HEADER                    = '00'
*   TRUNC_TRAILING_BLANKS     = ' '
*   WRITE_LF                  = 'X'
*   COL_SELECT                = ' '
*   COL_SELECT_MASK           = ' '
*   DAT_MODE                  = ' '
*   CONFIRM_OVERWRITE         = ' '
*   NO_AUTH_CHECK             = ' '
*   CODEPAGE                  = ' '
*   IGNORE_CERR               = ABAP_TRUE
*   REPLACEMENT               = '#'
*   WRITE_BOM                 = ' '
*   TRUNC_TRAILING_BLANKS_EOL = 'X'
*   WK1_N_FORMAT              = ' '
*   WK1_N_SIZE                = ' '
*   WK1_T_FORMAT              = ' '
*   WK1_T_SIZE                = ' '
*   WRITE_LF_AFTER_LAST_LINE  = ABAP_TRUE
* IMPORTING
*   FILELENGTH                =
  TABLES
    data_tab                  = it_output
*   FIELDNAMES                =
  EXCEPTIONS
    FILE_WRITE_ERROR          = 1
    NO_BATCH                  = 2
    GUI_REFUSE_FILETRANSFER   = 3
    INVALID_TYPE              = 4
    NO_AUTHORITY              = 5
    UNKNOWN_ERROR             = 6
    HEADER_NOT_ALLOWED        = 7
    SEPARATOR_NOT_ALLOWED     = 8
    FILESIZE_NOT_ALLOWED      = 9
    HEADER_TOO_LONG           = 10
    DP_ERROR_CREATE           = 11
    DP_ERROR_SEND             = 12
    DP_ERROR_WRITE            = 13
    UNKNOWN_DP_ERROR          = 14
    ACCESS_DENIED             = 15
    DP_OUT_OF_MEMORY          = 16
    DISK_FULL                 = 17
    DP_TIMEOUT                = 18
    FILE_NOT_FOUND            = 19
    DATAPROVIDER_EXCEPTION    = 20
    CONTROL_FLUSH_ERROR       = 21
    OTHERS                    = 22.
IF sy-subrc <> 0.
  MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
          WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
ENDIF. -
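The likely reason the duplicates "come back" in the report above: DELETE ADJACENT DUPLICATES only removes rows that sit next to each other, so each table must first be SORTed by exactly the fields listed in COMPARING (for example, it_qmfe is sorted by qmnum but compared by fecod and fegrp, and it_ihpa, it_crhd, it_crtx and it_iloa are never sorted at all). A minimal Python sketch of the same adjacent-only semantics:

```python
def delete_adjacent_duplicates(rows, key):
    """Mimic ABAP's DELETE ADJACENT DUPLICATES: drop a row only when
    it has the same key as the row immediately before it."""
    out = []
    for r in rows:
        if not out or key(out[-1]) != key(r):
            out.append(r)
    return out

rows = ["B", "A", "B", "A"]
# Unsorted: no two duplicates are adjacent, so nothing is removed.
assert delete_adjacent_duplicates(rows, lambda r: r) == ["B", "A", "B", "A"]
# Sorted by the comparison key first: duplicates become adjacent and are removed.
assert delete_adjacent_duplicates(sorted(rows), lambda r: r) == ["A", "B"]
```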
Duplicate Records generating for infocube.
hi all,
When I load data from the DataSource into the InfoCube, I get duplicate records. The data is loaded into the DataSource from a flat file. The first execution worked, but after I changed the flat-file structure the modified content does not appear in the InfoCube; instead it shows duplicates. The DataSource's 'preview data' option shows the required data (i.e. the modified flat file), but the InfoCube does not, even though I made all the necessary changes in the DataSource, InfoCube, InfoPackage and DTP. I even deleted the data in the InfoCube and still get the duplicates. What is the ideal solution to this problem? One way is to create a new DataSource with the modified flat file, but I think that is not ideal. What is a possible solution without creating the DataSource again?
Edited by: dharmatejandt on Oct 14, 2010 1:46 PM
Edited by: dharmatejandt on Oct 14, 2010 1:52 PM
Edited by: dharmatejandt on Oct 14, 2010 1:59 PM
Finally I got it. I deleted the request IDs in the InfoPackage (right-click the InfoPackage and go to Manage), then executed the transformation and the DTP, and finally got the required output without duplicates.
Edited by: dharmatejandt on Oct 14, 2010 4:05 PM -
Duplicate records in TABLE CONTROL
Hi folks,
I am doing a module pool where my internal table (itab) data comes into a table control (ctrl). I select one record in the table control and then press the REFRESH push button.
After pressing the refresh button, some new records are added to the same internal table, and I then need to display the modified internal table (with the new records added) in the table control.
The modified internal table data does come into the table control, but at the end of the table control some records are repeated.
Before it reaches the table control, I checked the modified itab and it contains the correct data, i.e. 15 records (previously I had 5 records; after the REFRESH button 10 more were added). But when this table reaches the table control, it contains some 100 records; I should get only 15.
Why are these records repeating? How do I delete the duplicate records from the table control?
Please suggest where I am making a mistake.
A correct answer will be rewarded.
Thanks & Regards
Hi,
Thanks for your help, but I should not refresh the internal table, since some records are already present. After pressing the REFRESH button, new records are appended to the existing table, and I then display both the previous records and the new ones.
I checked the internal table after the modification; it contains the actual number of records. But after it reaches the table control, more records appear.
Is this a problem with scrolling, or something else?
Please suggest where I am making a mistake. My code is below.
PROCESS BEFORE OUTPUT.
MODULE STATUS_0200.
module tc_shelf_change_tc_attr.
loop at object_tab1
with control tablctrl
cursor tablctrl-current_line.
module tc_shelf_get_lines.
endloop.
PROCESS AFTER INPUT.
module set_exit AT EXIT-COMMAND.
loop at object_tab1.
chain.
field: object_tab1-prueflos,
object_tab1-matnr.
module shelf_modify on chain-request.
endchain.
field object_tab1-idx
module shelf_mark on request.
endloop.
module shelf_user_command.
module user_command_0200.
***INCLUDE Y_RQEEAL10_STATUS_0200O01 .
*& Module STATUS_0200 OUTPUT
text
MODULE STATUS_0200 OUTPUT.
SET PF-STATUS 'MAIN'.
SET TITLEBAR 'xxx'.
ENDMODULE. " STATUS_0200 OUTPUT
*& Module tc_shelf_change_tc_attr OUTPUT
text
MODULE tc_shelf_change_tc_attr OUTPUT.
delete adjacent duplicates from object_tab1 comparing prueflos matnr.
describe table object_tab1 lines tablctrl-lines.
ENDMODULE. " tc_shelf_change_tc_attr OUTPUT
*& Module tc_shelf_get_lines OUTPUT
text
MODULE tc_shelf_get_lines OUTPUT.
data: g_tc_shelf_lines like sy-loopc.
if tablctrl-current_line > tablctrl-lines.
stop.
endif.
g_tc_tablctrl_lines = sy-loopc.
*refresh control tablctrl from screen 0200.
ENDMODULE. " tc_shelf_get_lines OUTPUT
***INCLUDE Y_RQEEAL10_SHELF_MODIFYI01 .
*& Module shelf_modify INPUT
text
MODULE shelf_modify INPUT.
modify object_tab1
index tablctrl-current_line.
ENDMODULE. " shelf_modify INPUT
*& Module set_exit INPUT
text
module set_exit INPUT.
leave program.
endmodule. " set_exit INPUT
*& Module shelf_mark INPUT
text
MODULE shelf_mark INPUT.
data: g_shelf_wa2 like line of object_tab1.
if tablctrl-line_sel_mode = 1
and object_tab1-idx = 'X'.
loop at object_tab1 into g_shelf_wa2
where idx = 'X'.
g_shelf_wa2-idx = ''.
modify object_tab1
from g_shelf_wa2
transporting idx.
endloop.
endif.
modify object_tab1
index tablctrl-current_line
transporting idx plnty plnnr plnal.
ENDMODULE. " shelf_mark INPUT
*& Module shelf_user_command INPUT
text
MODULE shelf_user_command INPUT.
ok_code = sy-ucomm.
perform user_ok_tc using 'TABLCTRL'
'OBJECT_TAB1'
changing ok_code.
sy-ucomm = ok_code.
ENDMODULE. " shelf_user_command INPUT
*& Module user_command_0100 INPUT
text
MODULE user_command_0200 INPUT.
data:v_line(3).
case OK_CODE.
when 'LAST'.
read table object_tab1 with key idx = 'X'.
if sy-subrc = 0.
select * from qals
where enstehdat <= object_tab1-enstehdat
and plnty ne space
and plnnr ne space
and plnal ne space.
if sy-dbcnt > 0.
if qals-enstehdat = object_tab1-enstehdat.
check qals-entstezeit < object_tab1-entstezeit.
move-corresponding qals to object_tab2.
append object_tab2.
else.
move-corresponding qals to object_tab2.
append object_tab2.
endif.
endif.
endselect.
sort object_tab2 by enstehdat entstezeit descending.
loop at object_tab2 to 25.
if not object_tab2-prueflos is initial.
append object_tab2 to object_tab1.
endif.
clear object_tab2.
endloop.
endif.
when 'SAVE'.
loop at object_tab1 where idx = 'X'.
if ( not object_tab1-plnty is initial and
not object_tab1-plnnr is initial and
not object_tab1-plnal is initial ).
select single * from qals into corresponding fields of wa_qals
where prueflos = object_tab1-prueflos.
if sy-subrc = 0.
wa_qals-plnty = object_tab1-plnty.
wa_qals-plnnr = object_tab1-plnnr.
wa_qals-plnal = object_tab1-plnal.
update qals from wa_qals.
if sy-subrc <> 0.
Message E001 with 'plan is not assigned to lot in sap(updation)'.
else.
v_line = tablctrl-current_line - ( tablctrl-current_line - 1 ).
delete object_tab1.
endif.
endif.
endif.
endloop.
when 'BACK'.
leave program.
when 'NEXT'.
call screen 300.
ENDCASE.
***INCLUDE Y_RQEEAL10_USER_OK_TCF01 .
*& Form user_ok_tc
text
-->P_0078 text
-->P_0079 text
<--P_OK_CODE text
form user_ok_tc using p_tc_name type dynfnam
p_table_name
changing p_ok_code like sy-ucomm.
data: l_ok type sy-ucomm,
l_offset type i.
search p_ok_code for p_tc_name.
if sy-subrc <> 0.
exit.
endif.
l_offset = strlen( p_tc_name ) + 1.
l_ok = p_ok_code+l_offset.
case l_ok.
when 'P--' or "top of list
'P-' or "previous page
'P+' or "next page
'P++'. "bottom of list
perform compute_scrolling_in_tc using p_tc_name
l_ok.
clear p_ok_code.
endcase.
endform. " user_ok_tc
*& Form compute_scrolling_in_tc
text
-->P_P_TC_NAME text
-->P_L_OK text
form compute_scrolling_in_tc using p_tc_name
p_ok_code.
data l_tc_new_top_line type i.
data l_tc_name like feld-name.
data l_tc_lines_name like feld-name.
data l_tc_field_name like feld-name.
field-symbols <tc> type cxtab_control.
field-symbols <lines> type i.
assign (p_tc_name) to <tc>.
concatenate 'G_' p_tc_name '_LINES' into l_tc_lines_name.
assign (l_tc_lines_name) to <lines>.
if <tc>-lines = 0.
l_tc_new_top_line = 1.
else.
call function 'SCROLLING_IN_TABLE'
exporting
entry_act = <tc>-top_line
entry_from = 1
entry_to = <tc>-lines
last_page_full = 'X'
loops = <lines>
ok_code = p_ok_code
overlapping = 'X'
importing
entry_new = l_tc_new_top_line
exceptions
others = 0.
endif.
get cursor field l_tc_field_name
area l_tc_name.
if syst-subrc = 0.
if l_tc_name = p_tc_name.
set cursor field l_tc_field_name line 1.
endif.
endif.
<tc>-top_line = l_tc_new_top_line.
endform. " COMPUTE_SCROLLING_IN_TC
Thanks -
Remove duplicate records in the Footer
Hello all:
I only want to print the final result (distinct records) in the footer section; therefore, the header and details sections are suppressed. However, I'm getting duplicate records in the footer section. How do I suppress the duplicate records?
Please help. Thank you so much in advance.
Here are my formulas :
Header
WhilePrintingRecords;
StringVar ConCat:=""
Details
WhilePrintingRecords;
StringVar Concat;
ConCat := Concat + Trim(ToText({MEDICATE.STARTDATE},"MM/dd/yyyy") + chr(7) + {MEDICATE.DESCRIPTION}) + chr(13)
Footer
WhilePrintingRecords;
StringVar ConCat;
Here's my desired output:
HUMALOG PEN 100 UNIT/ML SOLN
LANTUS 100 U/ML SOLN
METFORMIN HCL 1000 MG TABS
Below is an example my current output:
11/24/2009-HUMALOG PEN 100 UNIT/ML SOLN
11/24/2009-HUMALOG PEN 100 UNIT/ML SOLN
11/24/2009-HUMALOG PEN 100 UNIT/ML SOLN
11/24/2009-LANTUS 100 U/ML SOLN
11/24/2009-LANTUS 100 U/ML SOLN
11/24/2009-LANTUS 100 U/ML SOLN
01/24/2007-METFORMIN HCL 1000 MG TABS
01/24/2007-METFORMIN HCL 1000 MG TABS
01/24/2007-METFORMIN HCL 1000 MG TABS
11/24/2009-HUMALOG PEN 100 UNIT/ML SOLN
11/24/2009-HUMALOG PEN 100 UNIT/ML SOLN
11/24/2009-HUMALOG PEN 100 UNIT/ML SOLN
11/24/2009-LANTUS 100 U/ML SOLN
11/24/2009-LANTUS 100 U/ML SOLN
11/24/2009-LANTUS 100 U/ML SOLN
01/24/2007-METFORMIN HCL 1000 MG TABS
01/24/2007-METFORMIN HCL 1000 MG TABS
01/24/2007-METFORMIN HCL 1000 MG TABS
11/24/2009-HUMALOG PEN 100 UNIT/ML SOLN
11/24/2009-HUMALOG PEN 100 UNIT/ML SOLN
11/24/2009-HUMALOG PEN 100 UNIT/ML SOLN
11/24/2009-LANTUS 100 U/ML SOLN
11/24/2009-LANTUS 100 U/ML SOLN
11/24/2009-LANTUS 100 U/ML SOLN
01/24/2007-METFORMIN HCL 1000 MG TABS
01/24/2007-METFORMIN HCL 1000 MG TABS
01/24/2007-METFORMIN HCL 1000 MG TABS
Sorry, I forgot to mention I'm already grouping by Patient Name, and that's where I have created a formula called MEDSHEAD, which I've modified to add the concat line:
WhilePrintingRecords;
//StringVar ConCat:="";
StringVar ConCat;
ConCat := Concat + Trim(ToText({MEDICATE.STARTDATE},"MM/dd/yyyy") + chr(7) + {MEDICATE.DESCRIPTION}) + chr(13);
In the detail section : I'm displaying the the patient's name, DOB and Gender and the hemoglabin lab result and date of the lab result.
In the patient's footer section is where I have the formula called: medsfooter (WhilePrintingRecords; StringVar Concat; Concat;)
The changes that I made eliminate some of the duplicate records; however, the variable is no longer being reinitialized. What I mean is that I am getting the previous patient's medication information:
See output below:
Patient One 4/25/1958 F
6/26/09 12.2
9/2/2008-Glipizide XL 5 MG TB24
4/2/2009-Novolog 100 unit/ml soln
Patient Two 12/11/45 F
9/2/2008-Glipizide XL 5 MG TB24
4/2/2009-Novolog 100 unit/ml soln
11/24/2009-Humalog Pen 100 Unit/ML soln -
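The excerpt above doesn't show the accepted fix, but two changes would address both symptoms: reinitialize the accumulator in the group header (the commented-out ConCat := "" line), and append a medication only if it is not already in the accumulated string (in Crystal, an InStr check before the append). The dedup-before-append logic, sketched in Python as an analogy rather than Crystal syntax:

```python
def accumulate_unique(meds):
    """meds: (start_date, description) pairs, one per detail record.
    Append each description only the first time it is seen, mirroring
    a 'skip if already in ConCat' guard in the Crystal detail formula."""
    seen = set()
    lines = []
    for _date, desc in meds:
        if desc not in seen:
            seen.add(desc)
            lines.append(desc)
    return "\n".join(lines)

meds = (
    [("11/24/2009", "HUMALOG PEN 100 UNIT/ML SOLN")] * 3
    + [("11/24/2009", "LANTUS 100 U/ML SOLN")] * 3
    + [("01/24/2007", "METFORMIN HCL 1000 MG TABS")] * 3
)
assert accumulate_unique(meds).count("\n") == 2  # three distinct lines
```

Resetting `seen`/`lines` per patient corresponds to reinitializing the string variable in the group header, which fixes the carry-over into the next patient's footer.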
Duplicate records in SQ00 query
HI,
I am trying to make a query joining 3 tables. All tables have atleast 1 key field in common. In the output I have all fields from the main table, 3 fields from table "B" with a left outer join and 1 field from table "C" with an inner join. I am recieving duplicate records in the output. Is this issue a result of incorrect join conditions or is something else needed. Below is an illustration of join conditions.
I look forward to your responses
Regards,
CurtCurt - is this related to BI?
SQ00 is an ECC transaction and you may have better luck posting in one of the ERP spaces -
Duplicate records in database view for ANLA and ANLC tables
Hi all,
Can anyone please suggest how to remove duplicate records from the ANLA and ANLC tables when creating a database view?
Thanks in advance,
Ben.
Hi,
Suppose we have two tables, one with one key field and another with two:
TAB1 - key field KEY1
TAB2 - key fields KEY1 & KEY2.
Now if we create a database view of these two tables, we can do so by joining them on key field KEY1.
Suppose in the View tab we have included TAB1-KEY1.
Now let's suppose the following entries are in table TAB1: (AAA), (BBB), (CCC),
and the following entries are in table TAB2: (AAA, 1), (AAA, 2), (BBB, 3), (BBB, 5), (DDD, 3).
The database view will show the following entries:
AAA,
AAA,
BBB,
BBB,
These entries are duplicated in the output because TAB2 has multiple entries for the same key value of TAB1.
If we want to remove the multiple entries from the output, we need to include a selection condition such as TAB2-KEY2 = '1'.
Regards,
Pranav. -
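The fan-out described in the reply can be reproduced in a few lines using the thread's own sample data; the join below is the Python equivalent of the view's inner join on KEY1 projecting only TAB1-KEY1:

```python
tab1 = ["AAA", "BBB", "CCC"]
tab2 = [("AAA", 1), ("AAA", 2), ("BBB", 3), ("BBB", 5), ("DDD", 3)]

# Inner join on KEY1, projecting only TAB1-KEY1: each TAB1 row
# repeats once per matching TAB2 row, producing the duplicates.
view = [k1 for k1 in tab1 for (k2, _key2) in tab2 if k1 == k2]
assert view == ["AAA", "AAA", "BBB", "BBB"]

# A selection condition on KEY2 (like TAB2-KEY2 = '1') collapses the fan-out:
restricted = [k1 for k1 in tab1 for (k2, key2) in tab2 if k1 == k2 and key2 == 1]
assert restricted == ["AAA"]
```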
Array permitting duplicate record
Hi,
I have string array like below:
String strArr[] = {"style1","style2","style2","style4"};
In this string array, index 1 and index 2 have the same value, style2.
My target is to generate a key (system time) for each value, as below:
"key1" : "style1"
"key2" : "style2"
"key2" : "style2" //Here I want to reuse already generated key2 in place of creating new key like key3
"key4" : "style4 "
At last generated array should be:
["key1" : "style1",
"key2" : "style2",
"key2" : "style2" , //Here I want to retain this duplicate record
"key4" : "style4 "]
Value position is also important, like below array is not accepted:
["key2" : "style2",
"key1" : "style1",
"key2" : "style2" ,
"key4" : "style4 "]
I am struggling to achieve this.
Please guide me to achieve this!
Regards
My Efforts:
static Map<String, String> m = new HashMap<String, String>();
static int[] intarr;
static List<Integer> list = new ArrayList<Integer>();
public static void main(String[] args) throws InterruptedException {
String strArr[] = {"style1","style2","style2","style4"};
/**Find Index having duplicate value*/
for (int i = 0; i < strArr.length; i++) {
    for (int j = 0; j < strArr.length; j++) {
        if ((strArr[i] == strArr[j]) && (i != j)) {
            list.add(i);
            String newclass = "class" + new Date().getTime();
            // Thread.sleep(1000);
            m.put(newclass, strArr[i]);
            break;
        }
    }
}
for(int i : list){
//System.out.println("Duplicate value found at Index:: "+i);
/*String newduplicateclass = "class"+new Date().getTime();
Thread.sleep(1000);
for(int i=0; i<strArr.length; i++){
if(list.contains(i)){
System.out.println("i="+i);
m.put(newduplicateclass, strArr[i]);
}else{
String newclass = "class"+new Date().getTime();
m.put(newclass, strArr[i]);
Thread.sleep(1000);
System.out.println(m);
But want to do with smartest way using few lines of code.
Edited by: 898087 on Mar 17, 2012 3:16 AM
@TPD Thanks for your help! It gave me a speed up.
At last I am done with what I wanted and below is that code:
public static void main(String[] args) {
    String strArr[] = {"style7","style1","style5","style7","style2","style1","style2","style7","style2","style5","style7","style7"};
    Set<String> set = new HashSet<String>();
    for (int i = 0; i < strArr.length; i++) {
        set.add(strArr[i]);
    }
    Object objArr[] = set.toArray();
    int counter = 0;
    Map<String, Integer> map = new HashMap<String, Integer>();
    for (int j = 0; j < objArr.length; j++) {
        map.put(objArr[j].toString(), counter);
        counter++;
    }
    JSONArray jsonArr = new JSONArray();
    JSONObject jsonObj;
    for (int i = 0; i < strArr.length; i++) {
        jsonObj = new JSONObject();
        jsonObj.put(strArr[i], map.get(strArr[i]));
        jsonArr.add(jsonObj);
    }
    System.out.println(jsonArr);
}
//OUTPUT: [{"style7":0},{"style1":2},{"style5":1},{"style7":0},{"style2":3},{"style1":2},{"style2":3},{"style7":0},{"style2":3},{"style5":1},{"style7":0},{"style7":0}] -
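As an aside, the same "one stable key per distinct value, duplicates reuse it, order preserved" assignment can be done in a single dictionary pass; this is an illustrative sketch (the sequential keyN names are invented here, standing in for the thread's time-based keys):

```python
def assign_keys(values):
    # First occurrence of a value mints a new key; later duplicates
    # reuse it. Output order follows the input order.
    keys = {}
    out = []
    for v in values:
        if v not in keys:
            keys[v] = f"key{len(keys) + 1}"
        out.append((keys[v], v))
    return out

assert assign_keys(["style1", "style2", "style2", "style4"]) == [
    ("key1", "style1"),
    ("key2", "style2"),
    ("key2", "style2"),
    ("key3", "style4"),
]
```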
How to handel duplicate record by bcp command
Hi All,
I'm using BCP to import ASCII text data into a table that already has many records. BCP failed because of a 'duplicate primary key' error.
Now, is there any way, using BCP, to know precisely which record's primary key caused that 'violation of inserting duplicate key'?
I already used the -O option to output errors to an error.log, but it doesn't help much: that error log contains the same error message mentioned above without telling me exactly which record, so I can't pull that duplicate record out of my import data file.
TIA, and have a great day.
The only way of figuring out which PKs conflicted that I know of is to load the data into a different table and then run an INNER JOIN select between the two.
BCP.exe is not technically part of SSIS; I don't know why you are posting in this section of the forum.
Arthur My Blog
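The staging-table approach from the reply can be sketched end to end with SQLite (table and column names are invented for the example): bulk-load into a keyless staging table, then join against the target's primary key to list exactly the conflicting records before the real insert.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE target (id INTEGER PRIMARY KEY, val TEXT)")
con.execute("CREATE TABLE staging (id INTEGER, val TEXT)")  # no PK: loads never fail

# Existing rows, then the incoming file loaded into staging.
con.executemany("INSERT INTO target VALUES (?, ?)", [(1, "a"), (2, "b")])
con.executemany("INSERT INTO staging VALUES (?, ?)", [(2, "x"), (3, "y")])

# The INNER JOIN surfaces every primary key that would collide.
dupes = con.execute(
    "SELECT s.id FROM staging s INNER JOIN target t ON s.id = t.id"
).fetchall()
assert dupes == [(2,)]  # row with id 2 is the offending record
```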
How to remove duplicate records? Please help!
Dear Friends
I am a newcomer to this forum and have a simple SQL question. I would appreciate it if anyone can help.
Many thanks!
[email protected]
The Question:
Suppose we have some interviewers who specify the times they are available. At each interview 2 interviewers must be present.
A relation is set up to contain the available times for each interviewer, which has these fields: interviewerID, date, time. For example:
1, 14/12/04, 9:00-10:00
1, 14/12/04, 12:00-1:00
2, 12/11/04, 11:00-12:00
etc.
Then how can we write a SQL query that produces the times during which 2 interviewers are available?
A Possible Answer:
The following query produces the times during which 2 interviewers are available, however it has duplicate records:
SELECT a1.interviewerID, a2.interviewerID, a1.date, a1.time
FROM AvailableTimes AS a1, AvailableTimes AS a2
WHERE a1.date = a2.date AND a1.time = a2.time AND a1.interviewerID <> a2.interviewerID
An example of the output is:
1, 2, 12/12/04, 10:00-11:00
1, 3, 14/10/03, 12:00-1:00
2, 1, 12/12/04, 10:00-11:00
etc.
where line 3 is a duplicate of line 1 (only the position of interviewerID is changed).
Thanks Scott for your quick reply!!
This solves the problem for now, and is a very good idea!
But I was told this might have problems if the interviewerID field is later changed to something that contains letters instead of a pure numeric value, e.g. staffAB-01-ZC.
Do you know what I should do to have a long-term solution?
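Scott's fix isn't quoted in the excerpt, but the standard way to keep one row per pair is an inequality join (a1.interviewerID < a2.interviewerID instead of <>), and that keeps working for alphanumeric IDs because string comparison is lexicographic. The same unordered-pair logic, sketched in Python with invented sample data:

```python
from itertools import combinations

# Availability per interviewer; IDs may be alphanumeric (e.g. "staffAB-01-ZC").
avail = {
    "1": {"10:00-11:00", "12:00-1:00"},
    "2": {"10:00-11:00"},
    "staffAB-01-ZC": {"12:00-1:00"},
}

# combinations() emits each unordered pair exactly once, so (a, b) and
# (b, a) can never both appear: exactly the duplicate the self-join produced.
pairs = [(a, b, t)
         for a, b in combinations(sorted(avail), 2)
         for t in sorted(avail[a] & avail[b])]

assert ("1", "2", "10:00-11:00") in pairs
assert all(a < b for a, b, _t in pairs)  # no reversed duplicates possible
```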
SQL Query to retrieve one line from duplicate records
Hi
I have a table which contains duplicate records across multiple columns; the only difference is in one column, which contains either 0 or a positive value. The query I want retrieves only the line with the positive value for the duplicated records.
Here below is sample data for your reference:
CREATE TABLE TRANS (
CALLTRANSTYPE NVARCHAR2(6),
ORIGANI NVARCHAR2(40),
TERMANI NVARCHAR2(40),
STARTTIME DATE,
STOPTIME DATE,
CELLID NVARCHAR2(10),
CONNECTSECONDS NUMBER,
SWITCHCALLCHARGE NUMBER
);
INSERT INTO TRANS VALUES ('REC','555988801','222242850',to_date('05/15/2012 09:15:00','mm/dd/yyyy hh24:mi:ss'),to_date('05/15/2012 09:15:25','mm/dd/yyyy hh24:mi:ss'),null,25,0);
INSERT INTO TRANS VALUES ('REC','555988801','222242850',to_date('05/15/2012 09:15:00','mm/dd/yyyy hh24:mi:ss'),to_date('05/15/2012 09:15:25','mm/dd/yyyy hh24:mi:ss'),null,25,18000);
INSERT INTO TRANS VALUES ('REC','555988801','222242850',to_date('05/15/2012 09:18:03','mm/dd/yyyy hh24:mi:ss'),to_date('05/15/2012 09:18:20','mm/dd/yyyy hh24:mi:ss'),null,17,0);
The output i want to have is:
CALLTRANSTYPE ORIGANI TERMANI STARTTIME STOPTIME CELLID CONNECTSECONDS SWITCHCALLCHARGE
REC 555988801 222242850 05/15/2012 09:15:00 05/15/2012 09:15:25 25 18000
REC 555988801 222242850 05/15/2012 09:18:03 05/15/2012 09:18:20 17 0

Thank you.

Hi ekh,
this is the query I want; thank you for the help:
SQL> SELECT * FROM (
       SELECT CALLTRANSTYPE, ORIGANI, TERMANI, STARTTIME, STOPTIME, CELLID, CONNECTSECONDS, SWITCHCALLCHARGE,
              ROW_NUMBER() OVER (PARTITION BY STARTTIME, STOPTIME ORDER BY SWITCHCALLCHARGE DESC) rn
       FROM TRANS
     )
     WHERE rn = 1;
CALLTR ORIGANI TERMANI STARTTIME STOPTIME CELLID CONNECTSECONDS SWITCHCALLCHARGE RN
REC 555988801 222242850 15-MAY-12 15-MAY-12 25 18000 1
REC 555988801 222242850 15-MAY-12 15-MAY-12 17 0 1

Regards,
Lucienot. -
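For anyone who wants to try the ROW_NUMBER() approach without an Oracle instance, here is a sketch of the same query against SQLite via Python's sqlite3 (window functions need SQLite 3.25+; dates are kept as text for simplicity, column names follow the thread):

```python
import sqlite3

# Rebuild the TRANS sample data from the thread with simplified types.
con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE TRANS (
    CALLTRANSTYPE TEXT, ORIGANI TEXT, TERMANI TEXT,
    STARTTIME TEXT, STOPTIME TEXT, CELLID TEXT,
    CONNECTSECONDS INTEGER, SWITCHCALLCHARGE INTEGER)""")
con.executemany("INSERT INTO TRANS VALUES (?,?,?,?,?,?,?,?)", [
    ("REC", "555988801", "222242850", "2012-05-15 09:15:00", "2012-05-15 09:15:25", None, 25, 0),
    ("REC", "555988801", "222242850", "2012-05-15 09:15:00", "2012-05-15 09:15:25", None, 25, 18000),
    ("REC", "555988801", "222242850", "2012-05-15 09:18:03", "2012-05-15 09:18:20", None, 17, 0),
])

# Rank rows within each (STARTTIME, STOPTIME) group, highest charge first,
# then keep only the top-ranked row per group.
rows = con.execute("""
    SELECT * FROM (
        SELECT t.*, ROW_NUMBER() OVER (
                   PARTITION BY STARTTIME, STOPTIME
                   ORDER BY SWITCHCALLCHARGE DESC) AS rn
        FROM TRANS t)
    WHERE rn = 1
    ORDER BY STARTTIME""").fetchall()
print([(r[6], r[7]) for r in rows])  # (CONNECTSECONDS, SWITCHCALLCHARGE) pairs
```

The duplicated 09:15 call keeps its 18000-charge row, while the unique 09:18 call with charge 0 survives untouched.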
Hide Duplicate records of Key Fields (SAP BEX 7)
Need Advice.
On BEx 7, I want to hide repeated/duplicate records in a key figure (formula) in the output when it is converted to a spreadsheet. What I did was modify the query properties and uncheck 'Hide Repeated Key Values' under Display Options, but the only fields affected by this change were the characteristic fields. Duplicate records in the key figures are still visible, and I can't find any other way to hide them. A sample output is shown below:
Defect:
Plan Owner | Plan ID | Status | Plan Comment | Total Plan Count
A | ID001 | Active | Test A | 15
A | ID001 | Active | Test B | 15
A | ID001 | Active | Test C | 15
Modified:
Plan Owner | Plan ID | Status | Plan Comment | Total Plan Count
A | ID001 | Active | Test A | 15
_ | _____| _____ | Test B | 15
_ | _____| _____ | Test C | 15
Only records on the characteristic field were hidden.
the value 15 still exist for all 'ROWS' after exporting into spreadsheet.
Question: Is it "POSSIBLE" to hide as well those duplicate records on the KF?
Thanks in Advance!

Hi,
I also faced a similar issue where I was getting duplicate key figure values; it usually happens when you build a query on an InfoSet or do a lookup on another DSO to get the value.
You may try code that deletes adjacent duplicate records, keeping a check on your characteristic.
Hope it helps.
Regards,
AL
Edited by: AL1112 on Sep 8, 2011 12:36 PM -
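AL's suggestion corresponds to ABAP's DELETE ADJACENT DUPLICATES ... COMPARING; an alternative that keeps all rows is to post-process the exported rows the way 'Hide Repeated Key Values' does, but extended to the key figure. A rough Python sketch of blanking repeated values after export (field names mirror the sample output and are assumptions):

```python
# Blank a column's value whenever it repeats the previous row's value,
# mimicking "Hide Repeated Key Values" for arbitrary columns including
# the key figure ("count" here).
def hide_repeated(rows, cols):
    out, prev = [], {}
    for row in rows:
        shown = dict(row)
        for c in cols:
            if prev.get(c) == row[c]:
                shown[c] = ""
        prev = row
        out.append(shown)
    return out

rows = [
    {"owner": "A", "plan": "ID001", "status": "Active", "comment": "Test A", "count": 15},
    {"owner": "A", "plan": "ID001", "status": "Active", "comment": "Test B", "count": 15},
    {"owner": "A", "plan": "ID001", "status": "Active", "comment": "Test C", "count": 15},
]
cleaned = hide_repeated(rows, ["owner", "plan", "status", "count"])
print([r["count"] for r in cleaned])  # [15, '', '']
```

The comments survive on every row; only the repeated owner/plan/status/count values are suppressed, matching the "Modified" layout the poster wants but with the 15s hidden as well.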
Duplicate records in master data info object
dear friends,
I have a standard InfoObject called 'A' with attributes B, C, D, E. For it, a standard datasource with the fields A, B, C, D, E exists, and I have loaded the data. The ABAPers created a Z table with the fields P, Q, R, S, X, Y, Z, where 'P' holds the same records as the 'A' InfoObject. My requirement is to create a report on the following fields:
P,Q,R,S,B,C,D,E
What I did was create a generic datasource for the fields P, Q, R, S, and I added the attributes Q, R, S to the standard InfoObject 'A'.
I created and scheduled an InfoPackage from the standard datasource (A, B, C, D, E) to the standard InfoObject (A, B, C, D, E, Q, R, S). Next I created and scheduled another InfoPackage from the generic datasource (P, Q, R, S) to the standard InfoObject (A, B, C, D, E, P, Q, R, S) with transfer rules P->A, Q->Q, R->R, S->S. After loading the data I am getting duplicate records. This is what my master data looks like:
A B C D E P Q R S
1 2 3 4 5
2 3 4 5 6
3 4 5 6 7
1 6 7 8 9
2 7 8 9 3
3 4 6 2 1
This is how my master data looks, but I need it in the following format:
A B C D E P Q R S
1 2 3 4 5 6 7 8 9
2 3 4 5 6 7 8 9 3
3 4 5 6 7 4 6 2 1
Please let me know.
Thanks & regards,
Hari

Hari,
why don't you enhance the master data InfoObject? You are supposed to see overwritten records, since InfoObject A is the primary key of the table.
Try enhancing the master data datasource and you will get the required output, or create a generic master data datasource.
All the best.
any questions let us know.
Nagesh. -
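Nagesh's point can be illustrated outside BW: loading the two sources as separate rows produces the duplicate pattern above, whereas joining them on the shared key (transfer rule P -> A makes A the common key) yields one enriched row per key, which is what overwriting master data attributes does. A small Python sketch using the sample values from the post:

```python
# Attribute rows from the standard load (keyed on A) and from the
# generic/Z-table load (also keyed on A after the P -> A transfer rule).
std = {1: (2, 3, 4, 5), 2: (3, 4, 5, 6), 3: (4, 5, 6, 7)}   # A -> (B, C, D, E)
gen = {1: (6, 7, 8, 9), 2: (7, 8, 9, 3), 3: (4, 6, 2, 1)}   # A -> (P, Q, R, S)

# Merging on the key instead of appending rows gives one complete
# record per value of A -- the desired output format.
merged = {a: std[a] + gen[a] for a in std if a in gen}
print(merged[1])  # (2, 3, 4, 5, 6, 7, 8, 9)
```

In BW terms this merge happens automatically when both loads target the same InfoObject key and the attributes are overwritten, rather than arriving as two half-filled records.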
Duplicate records at a particular time
Hi, I am getting duplicate records at a particular point in time. Could you please help me with this?
The drive on which I installed Informatica ran out of disk space, and I found this in the error log:

SF_34125 Error in writing storage file [C:\Informatica\9.0.1\server\infa_shared\Storage\pmservice_Domain_ssintr01_INT_SSINTR01_1314615470_0.dat]. System returns error code [errno = 28], error message [No space left on device].

Then I tried to shut down the integration service and free up some space on the disk. I got the following messages in the log file:

LM_36047 Waiting for all running workflows to complete.
SF_34014 Service [INT_SSINTR01] on node [node01_ssintr01] shut down.

Then when I tried to start the integration service again, I got the following error:

Could not execute action... The Service INT_SSINTR01 could not be enabled due to the following error: [DOM_10079] Unable to start service [INT_SSINTR01] on any node specified for the service

After this I am not able to find any entry in the log file for the integration service, so I went to the domain log for more details. I found these in the domain log:

DOM_10126 Request to disable [SERVICE] [INT_SSINTR01] in [COMPLETE] mode.
DOM_10130 Stop service process for [SERVICE] [INT_SSINTR01] on node [node01_ssintr01].
LIC_10040 Service [INT_SSINTR01] is stopping on node [node01_ssintr01].
SPC_10015 Request to stop process for service [INT_SSINTR01] with mode [COMPLETE] on node [node01_ssintr01].
DOM_10127 Request to disable service [INT_SSINTR01] completed.
DOM_10126 Request to disable [SERVICE] [Repo_SSINTR01] in [ABORT] mode.
DOM_10130 Stop service process for [SERVICE] [Repo_SSINTR01] on node [node01_ssintr01].
LIC_10042 Repository instance [Repo_SSINTR01] is stopping on node [node01_ssintr01].
SPC_10015 Request to stop process for service [Repo_SSINTR01] with mode [ABORT] on node [node01_ssintr01].
DOM_10127 Request to disable service [Repo_SSINTR01] completed.
DOM_10115 Request to enable [service] [Repo_SSINTR01].
DOM_10117 Starting service process for service [Repo_SSINTR01] on node [node01_ssintr01].
SPC_10014 Request to start process for service [Repo_SSINTR01] on node [node01_ssintr01].
SPC_10018 Request to start process for service [Repo_SSINTR01] was successful.
SPC_10051 Service [Repo_SSINTR01] started on port [6,019] successfully.
DOM_10118 Service process started for service [Repo_SSINTR01] on node [node01_ssintr01].
DOM_10121 Selecting a primary service process for service [Repo_SSINTR01].
DOM_10120 Service process on node [node01_ssintr01] has been set as the primary node of service [Repo_SSINTR01].
DOM_10122 Request to enable service [Repo_SSINTR01] completed.
LIC_10041 Repository instance [Repo_SSINTR01] has started on node [node01_ssintr01].
DOM_10115 Request to enable [service] [INT_SSINTR01].
DOM_10117 Starting service process for service [INT_SSINTR01] on node [node01_ssintr01].
SPC_10014 Request to start process for service [INT_SSINTR01] on node [node01_ssintr01].
DOM_10055 Unable to start service process [INT_SSINTR01] on node [node01_ssintr01].
DOM_10079 Unable to start service [INT_SSINTR01] on any node specified for the service.
DOM_10126 Request to disable [SERVICE] [INT_SSINTR01] in [COMPLETE] mode.
DOM_10130 Stop service process for [SERVICE] [INT_SSINTR01] on node [node01_ssintr01].
LIC_10040 Service [INT_SSINTR01] is stopping on node [node01_ssintr01].
SPC_10015 Request to stop process for service [INT_SSINTR01] with mode [COMPLETE] on node [node01_ssintr01].
DOM_10127 Request to disable service [INT_SSINTR01] completed.
DOM_10126 Request to disable [SERVICE] [Repo_SSINTR01] in [ABORT] mode.
DOM_10130 Stop service process for [SERVICE] [Repo_SSINTR01] on node [node01_ssintr01].
LIC_10042 Repository instance [Repo_SSINTR01] is stopping on node [node01_ssintr01].
SPC_10015 Request to stop process for service [Repo_SSINTR01] with mode [ABORT] on node [node01_ssintr01].
DOM_10127 Request to disable service [Repo_SSINTR01] completed.
DOM_10115 Request to enable [service] [Repo_SSINTR01].
DOM_10117 Starting service process for service [Repo_SSINTR01] on node [node01_ssintr01].
SPC_10014 Request to start process for service [Repo_SSINTR01] on node [node01_ssintr01].
SPC_10018 Request to start process for service [Repo_SSINTR01] was successful.
SPC_10051 Service [Repo_SSINTR01] started on port [6,019] successfully.
DOM_10118 Service process started for service [Repo_SSINTR01] on node [node01_ssintr01].
DOM_10121 Selecting a primary service process for service [Repo_SSINTR01].
DOM_10120 Service process on node [node01_ssintr01] has been set as the primary node of service [Repo_SSINTR01].
DOM_10122 Request to enable service [Repo_SSINTR01] completed.
LIC_10041 Repository instance [Repo_SSINTR01] has started on node [node01_ssintr01].
DOM_10115 Request to enable [service] [INT_SSINTR01].
DOM_10117 Starting service process for service [INT_SSINTR01] on node [node01_ssintr01].
SPC_10014 Request to start process for service [INT_SSINTR01] on node [node01_ssintr01].
DOM_10055 Unable to start service process [INT_SSINTR01] on node [node01_ssintr01].
DOM_10079 Unable to start service [INT_SSINTR01] on any node specified for the service.

Then I tried shutting down the domain and restarting the Informatica service again. I got the following error when the integration service was initialized:

DOM_10115 Request to enable [service] [INT_SSINTR01].
DOM_10117 Starting service process for service [INT_SSINTR01] on node [node01_ssintr01].
SPC_10014 Request to start process for service [INT_SSINTR01] on node [node01_ssintr01].
SPC_10009 Service process [INT_SSINTR01] output [Informatica(r) Integration Service, version [9.0.1], build [184.0604], Windows 32-bit].
SPC_10009 Service process [INT_SSINTR01] output [Service [INT_SSINTR01] on node [node01_ssintr01] starting up.].
SPC_10009 Service process [INT_SSINTR01] output [Logging to the Windows Application Event Log with source as [PmServer].].
SPC_10009 Service process [INT_SSINTR01] output [Please check the log to make sure the service initialized successfully.].
SPC_10008 Service Process [INT_SSINTR01] output error [ERROR: Unexpected condition at file:[..\utils\pmmetrics.cpp] line:[2118]. Application terminating. Contact Informatica Technical Support for assistance.].
SPC_10012 Process for service [INT_SSINTR01] terminated unexpectedly.
DOM_10055 Unable to start service process [INT_SSINTR01] on node [node01_ssintr01].
DOM_10079 Unable to start service [INT_SSINTR01] on any node specified for the service.

I tried creating a new integration service and associating it with the same repository; even then I got the same error. So I tried creating a new repository and a new integration service, and still got the same error. What might be the workaround to start the integration service?
-
How to suppress duplicate records in rtf templates
Hi All,
I am facing an issue with payment reason comments in a check template.
We display payment reason comments. The issue is that when making a batch payment, we get multiple payment reason comments with the same name from multiple invoices, and it doesn't look good. You can see the payment reason comments under the tail number text field in the template.
Could you provide any XML syntax to suppress duplicate records, so that only distinct payment reason comments are shown?
Attached screen shot, template and xml file for your reference.
Thanks,
Sagar.

I have CRXI, so the instructions are for this release.
You can create a formula; I called it Cust_Matches:
if {table.CustID} = previous({table.CustID}) then 'true' else 'false'
(substitute your own table and field for {table.CustID})
In your GH2 section, right-click the field, select Format Field, and select the Common tab (far left at the top).
Select the formula button to the right of Suppress and type in:
{@Cust_Matches} = 'true'
Now every time {@Cust_Matches} is 'true', the CustID will be suppressed.
Do the same with the other fields you wish to hide, i.e. Address, City, etc.
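As a workaround at the data level (rather than in the template), the repeated payment reason comments can be reduced to the distinct set before rendering. A hypothetical Python sketch; the comment strings are invented:

```python
# Collapse repeated comments to the distinct set, preserving the order
# in which each comment first appears (dict keys keep insertion order).
comments = ["Rent May 2012", "Rent May 2012", "Rent May 2012", "Office supplies"]
distinct = list(dict.fromkeys(comments))
print(distinct)  # ['Rent May 2012', 'Office supplies']
```

`dict.fromkeys` is a compact alternative to a seen-set loop when you want order-preserving deduplication.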