BDC program - avoid duplicate records
Hello Experts,
I am writing a BDC program and my duplicate check is not working.
My requirement: if a vendor in the Excel file has the same Company Code, Purchasing Org., Account Group and Name1 as a vendor already in the database, that record must not be uploaded.
it_success is the internal table holding the records to be uploaded after some validations.
Please check where I am going wrong:
loop at i_lfa1.
  loop at i_lfb1 where lifnr = i_lfa1-lifnr.
    loop at i_lfm1 where lifnr = i_lfb1-lifnr.
      main_table-lifnr = i_lfa1-lifnr.
      main_table-bukrs = i_lfb1-bukrs.
      main_table-ekorg = i_lfm1-ekorg.
      main_table-ktokk = i_lfa1-ktokk.
      main_table-name1 = i_lfa1-name1.
      append main_table.
    endloop.
  endloop.
endloop.

loop at it_success.
  loop at main_table where bukrs = it_success-bukrs_001
                       and ekorg = it_success-ekorg_002
                       and ktokk = it_success-ktokk_003
                       and name1 = it_success-name1_006.
    move-corresponding it_success to it_error.
    append it_error.
    delete main_table.
    if sy-subrc eq 0.
      delete it_success.
      e_fret-type = 'E'.
      e_fret-name = it_error-name1_006.
      e_fret-message = 'Vendor name already exists'.
      append e_fret.
    endif.
  endloop.
endloop.
Ravi
Edited by: Julius Bussche on Oct 1, 2008 9:30 AM
Please use meaningful subject titles.
Hi,
loop at i_lfa1.
  loop at i_lfb1 where lifnr = i_lfa1-lifnr.
    read table i_lfm1 with key lifnr = i_lfb1-lifnr.
    if sy-subrc = 0.
      main_table-ekorg = i_lfm1-ekorg.
    endif.
    main_table-lifnr = i_lfa1-lifnr.
    main_table-bukrs = i_lfb1-bukrs.
    main_table-ktokk = i_lfa1-ktokk.
    main_table-name1 = i_lfa1-name1.
    append main_table.
  endloop.
endloop.
Try the above code.
Thanks,
Durai.V
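Building on the thread above, here is a minimal sketch of the duplicate filter itself; the field names bukrs_001, ekorg_002, ktokk_003 and name1_006 are taken from Ravi's post, everything else is an assumption. The idea is to test for the duplicate first with READ TABLE, and only then move the record to the error table and drop it from it_success:

```abap
sort main_table by bukrs ekorg ktokk name1.

loop at it_success.
  " look for a vendor with the same four key fields in the database extract
  read table main_table with key bukrs = it_success-bukrs_001
                                 ekorg = it_success-ekorg_002
                                 ktokk = it_success-ktokk_003
                                 name1 = it_success-name1_006
                        binary search
                        transporting no fields.
  if sy-subrc = 0.
    " duplicate found: log it and remove it from the upload set
    move-corresponding it_success to it_error.
    append it_error.
    e_fret-type    = 'E'.
    e_fret-name    = it_success-name1_006.
    e_fret-message = 'Vendor name already exists'.
    append e_fret.
    delete it_success.
  endif.
endloop.
```

This way sy-subrc is checked right after the READ (not after a DELETE, as in the original), and main_table is not modified while it is being looped over.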
Similar Messages
-
Hi guys,
Could you please let me know where the option for avoiding duplicate records is:
1. in the case of an InfoPackage?
2. in the case of a DTP?
Hi,
In the case of an InfoPackage in 3.5: Processing tab -> select 'Only PSA' with 'Update subsequently in data targets' and 'Ignore double data records'.
In 7.0, the Processing tab selection defaults to 'Only PSA'.
In the case of a DTP: Update tab -> select 'Handle duplicate record keys'. -
How to avoid Duplicate Records while joining two tables
Hi,
I am trying to join three tables; two of them are basically the same, one being a history table. So I wrote a query like:
select e.id,
       e.seqno,
       e.name,
       d.resdate,
       d.details
from employees e
join (select * from dept
      union
      select * from dept_hist) d
  on d.id = e.id
 and d.seqno = e.seqno
but this is returning duplicate records.
Could anyone please tell me how to avoid duplicate records in this query? Once a record is processed it is moved to the history table, so the two tables should not contain the same records; since I need records from both tables, I take the UNION of the two, so d holds the combined rows.
But I am still getting duplicate records, even with DISTINCT. -
Avoiding duplicate records while inserting into the table
Hi
I tried the following INSERT statement, where I want to avoid duplicate records during the insert itself, but I am getting an error like 'invalid identifier', although the column exists in the table.
Please let me know where I am making the mistake.
INSERT INTO t_map tm (sn_id, o_id, txt, typ, sn_time)
SELECT 100,
       sk.obj_id,
       sk.key_txt,
       sk.obj_typ,
       sysdate
FROM s_key sk
WHERE sk.obj_typ = 'AY'
  AND SYSDATE BETWEEN sk.start_date AND sk.end_date
  AND sk.obj_id IN (100170, 1001054)
  AND NOT EXISTS (SELECT 1
                  FROM t_map tm1
                  WHERE tm1.o_id = tm.o_id
                    AND tm1.sn_id = tm.sn_id
                    AND tm1.txt = tm.txt
                    AND tm1.typ = tm.typ
                    AND tm1.sn_time = tm.sn_time)
You are referencing the alias tm inside the subquery, but where is that table joined? Do you want it like this?
INSERT INTO t_map (sn_id, o_id, txt, typ, sn_time)
SELECT 100,
       sk.obj_id,
       sk.key_txt,
       sk.obj_typ,
       sysdate
FROM s_key sk
WHERE sk.obj_typ = 'AY'
  AND SYSDATE BETWEEN sk.start_date AND sk.end_date
  AND sk.obj_id IN (100170, 1001054)
  AND NOT EXISTS (SELECT 1
                  FROM t_map tm
                  WHERE tm.o_id = sk.obj_id
                    AND tm.sn_id = 100
                    AND tm.txt = sk.key_txt
                    AND tm.typ = sk.obj_typ
                    AND tm.sn_time = sysdate) -
SQL*Loader control-file parameters to avoid loading duplicate records
Hi All,
I am looking for an option in the control file that stops SQL*Loader from loading duplicate records. I know that a constraint on the table itself will keep duplicate data out and reject it into a file, but I need to achieve this in the control file so that SQL*Loader itself avoids loading duplicates. Can you please suggest which control-file option enables this?
Can you also tell me the exact difference between the bad file and the reject file? In which scenarios does a record get written to each?
Regards
Hey,
I don't think there is any option to avoid loading duplicate records via a parameter in the control file.
On the difference between the bad and reject files, try this link:
http://www.exforsys.com/content/view/1587/240/
Regards,
Sushant -
How to avoid duplicate records in a file-to-file scenario
Hi Guys,
Could you please provide a solution
to avoid duplicate entries in a flat file, based on a key field?
I am asking in terms of standard functions,
either at message-mapping level or by configuring the file adapter.
Warm regards,
Mahesh
Hi Mahesh,
Write a module processor to check for duplicate records in the file adapter,
or eliminate the duplicate records with a Java/ABAP mapping.
Also check these links:
Re: How to Handle this "Duplicate Records"
Duplicate records
Ignoring Duplicate Records--urgent
Re: Duplicate records frequently occurred
Re: Reg ODS JUNK DATA
http://help.sap.com/saphelp_nw2004s/helpdata/en/d0/538f3b294a7f2de10000000a11402f/frameset.htm
regards
srinivas -
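For the ABAP-mapping route mentioned above, the core of the duplicate elimination is usually just a sort plus DELETE ADJACENT DUPLICATES on the key field. A minimal sketch, where it_source and keyfield are placeholder names rather than fields from an actual interface:

```abap
sort it_source by keyfield.
delete adjacent duplicates from it_source comparing keyfield.
```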
How to create a BDC program for a given recording.
I have to create a BDC program for uploading a file. Currently I am using
CALL FUNCTION 'F4_FILENAME'.
Just check out the code below; it is for updating two transactions.
* Types Declaration *
types: begin of t_tab1,
         vendor(10),
         material(18),
         pur_org(4),
         wglif(18),
       end of t_tab1.
* Data Declaration *
data: begin of it_tab5 occurs 0,
        vendor(10),
        material(18),
      end of it_tab5.
*data: begin of it_tab6 occurs 0,
*        vendor(10),
*        material(18),
*      end of it_tab6.
data: it_tab1 type standard table of t_tab1 with header line.
data: wa_tab1 type t_tab1.
data: wa_tab2 type t_tab1.
data: it_tab3 like bdcdata occurs 0 with header line.
data: it_tab4 like bdcdata occurs 0 with header line.
data: it_tab2 type table of bdcmsgcoll with header line.
data: d_file_name like ibipparms-path,
d_file_name1 type string.
* Start-of-selection *
start-of-selection.
* FM for finding the flat file
call function 'F4_FILENAME'
EXPORTING
PROGRAM_NAME = SYST-CPROG
DYNPRO_NUMBER = SYST-DYNNR
FIELD_NAME = ' '
importing
file_name = d_file_name.
d_file_name1 = d_file_name.
* FM for uploading data from flat file into internal table
call function 'GUI_UPLOAD'
exporting
filename = d_file_name1
filetype = 'ASC'
HAS_FIELD_SEPARATOR = ' '
HEADER_LENGTH = 0
READ_BY_LINE = 'X'
DAT_MODE = ' '
CODEPAGE = ' '
IGNORE_CERR = ABAP_TRUE
REPLACEMENT = '#'
CHECK_BOM = ' '
* IMPORTING
*   FILELENGTH =
*   HEADER =
tables
data_tab = it_tab5
exceptions
file_open_error = 1
file_read_error = 2
no_batch = 3
gui_refuse_filetransfer = 4
invalid_type = 5
no_authority = 6
unknown_error = 7
bad_data_format = 8
header_not_allowed = 9
separator_not_allowed = 10
header_too_long = 11
unknown_dp_error = 12
access_denied = 13
dp_out_of_memory = 14
disk_full = 15
dp_timeout = 16
others = 17.
if sy-subrc <> 0.
message id sy-msgid type sy-msgty number sy-msgno
with sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
endif.
* End-of-selection *
end-of-selection.
sort it_tab5 by vendor.
loop at it_tab5.
perform bdc_dynpro using 'SAPLBG00' '0101'.
perform bdc_field  using 'BDC_OKCODE' '/00'.
perform bdc_dynpro using 'SAPLBG00' '1000'.
perform bdc_field  using 'BDC_OKCODE' '=BUCH'.
perform bdc_field using 'BDC_SUBSCR'
'SAPLBG00 1101KOPF_1000'.
perform bdc_field using 'GBGMK-GAART' '2' .
perform bdc_field using 'GBGMK-GAERB' 'X' .
perform bdc_field using 'BDC_SUBSCR'
'SAPLBG00 1103TAB_SUB_1000'.
perform bdc_field using 'BDC_CURSOR' 'GBGMP-LSTNR(01)' .
perform bdc_field using 'GBGMP-LSTNR(01)' it_tab5-material .
call transaction 'BGM1' using it_tab3 mode 'E' messages into it_tab2.
refresh it_tab3.
wa_tab1-pur_org = 'ABCP'.
loop at it_tab2.
endloop.
perform bdc_dynpro using 'SAPMM06I' '0100'.
perform bdc_field using 'BDC_CURSOR' 'EINE-EKORG' .
perform bdc_field using 'BDC_OKCODE' '/00' .
perform bdc_field using 'EINA-LIFNR' it_tab5-vendor .
perform bdc_field using 'EINA-MATNR' it_tab5-material .
perform bdc_field using 'EINE-EKORG' wa_tab1-pur_org .
perform bdc_field using 'RM06I-NORMB' 'X' .
perform bdc_dynpro using 'SAPMM06I' '0101'.
perform bdc_field using 'BDC_CURSOR' 'EINA-WGLIF' .
perform bdc_field using 'BDC_OKCODE' '=BU' .
perform bdc_field using 'EINA-WGLIF' it_tab2-msgv1 .
call transaction 'ME12' using it_tab3 mode 'E'.
refresh it_tab3.
refresh it_tab2.
endloop.
*&      Form  BDC_DYNPRO
*       Start new screen
*      -->P_FNAM  text
*      -->P_FVAL  text
form bdc_dynpro using program
dynpro.
clear it_tab3.
it_tab3-program = program.
it_tab3-dynpro = dynpro.
it_tab3-dynbegin = 'X'.
append it_tab3.
endform. " BDC_DYNPRO
*&      Form  BDC_FIELD
*       Insert field
*      -->P_FNAM  text
*      -->P_FVAL  text
form bdc_field using fnam
fval.
clear it_tab3.
it_tab3-fnam = fnam.
it_tab3-fval = fval.
append it_tab3.
endform. " BDC_FIELD
<REMOVED BY MODERATOR>
Edited by: Alvaro Tejada Galindo on Apr 14, 2008 6:20 PM -
Oracle 10 - Avoiding Duplicate Records During Import Process
I have two databases on different servers (DB1 and DB2) and a DB link connecting the two. In DB2, I have 100 million records in table DB2TARGET.
I tried to load 100 million more records from DB1SOURCE into DB2TARGET, on top of the existing 400 million, but after an hour I got a network error from DB2; the load failed after inserting 70% of the records.
Now I have three tasks. First, I have to find the duplicate records between DB1 and DB2. Second, I have to find the remaining 30% of records missing from DB2TARGET.
Third, I have to re-load those remaining 30%. What is the best solution?
Finding duplicates:
SELECT COUNT(*), a, b
FROM db2target
GROUP BY a, b
HAVING COUNT(*) > 1
Re-loading:
MERGE INTO db2target tgt
USING db1source src
ON (tgt.a = src.a)
WHEN NOT MATCHED THEN
  INSERT (tgt.a, tgt.b)
  VALUES (src.a, src.b)
Thanks for any guidance.
When I execute this I get the following error message:
SQL Error: ORA-02064: distributed operation not supported
02064. 00000 - "distributed operation not supported"
*Cause: One of the following unsupported operations was attempted
1. array execute of a remote update with a subquery that references
a dblink, or
2. an update of a long column with bind variable and an update of
a second column with a subquery that both references a dblink
and a bind variable, or
3. a commit is issued in a coordinated session from an RPC procedure
call with OUT parameters or function call.
*Action: simplify remote update statement -
Scenario - Webservice - XI - BW. How to Avoid duplicate records?
Hi all,
Webservice --> XI --> BW.
A BPM has been used to send the response back.
BPM:
start -> Receive (request) -> Transformation (response map) -> Send (send to BW) -> Send (send response) -> stop.
We are making use of the MSGID to maintain the uniqueness of each message coming from the web service. Uniqueness is maintained for the combination sale-num:trans-num:sap-saletype:sale-type, as below; one MSGID is registered in XI for each unique message.
Example: sale-num:trans-num:sap-saletype:sale-type
1983:5837:E:NEW
If a duplicate message is received again, the response "DUPLICATE_GUID:c9237200-0c69-2a80-dd11-79d5b47b213a" is sent back to the web service.
It is working correctly. The only problem is when XI is down, or a communication failure happens in the middle of processing, as in the example below.
A sample call that failed recently: a web-service call failed three times, for the following reasons.
First time:
"FAILED TO INVOKE WEB SERVICE OPERATION OS_CWUSales
Error receiving Web Service Response: Fatal Error: csnet read operation failed (No such file or directory) (11773)".
Second time:
MessageExpiredException: Message c9237200-0c69-2a80-dd11-79d5b47b213a (OUTBOUND) expired.
Third time:
"DUPLICATE_GUID:c9237200-0c69-2a80-dd11-79d5b47b213a"
Notice that on the second call the MSGID was registered, but because the server was down (or for some other reason) the message could not be processed further. So the MSGID was registered without the message actually being processed. When they retried a third time with the same call, they got the "DUPLICATE GUID" error.
"DUPLICATE GUID" implies that the message has been processed and the records updated in the back-end system, which did not happen here.
The final result:
The status in the web service shows "it has been updated in the receiving system", since a duplicate GUID is indicated,
but it has not been updated in the back-end system, which is the problem.
First, are there any suggestions on how to solve this problem?
Is there a better way to handle this duplicate check instead of the MSGID?
Please help me in solving this.
Thanks & Regards
Deepthi.
Edited by: deepthi reddy on Jan 7, 2009 2:07 AM
>> My suggestion: you can have a Webservice - BW synch-synch scenario without BPM: a sender SOAP adapter sends a synchronous request message, it is mapped to the BW request, the response comes back from BW, and it is then mapped to the web-service response and sent to the web service without any receiver SOAP adapter.
Thanks for the suggestion. Looks like a good idea.
>> Regarding the duplicate check: when your BW system gets the request message, it processes it and then sends a response message. Have a STATUS field in this response with value S for success and E for error; send this response to the web service and store it in a database for that request MSGID. Then, in the web-service application, check the response code for a MSGID and resend the message to BW only if it is E for error.
Initially they planned it the same way, but sometimes the response comes back very late from BW. So now, once the request reaches XI, they immediately send the response back from XI itself with a hard-coded "OK" status; they do not wait for the BAPI response from BW.
So if the message is successful, the status goes back as "OK".
If the message is not successful, the status goes back blank, indicating a problem on the other side. They then check the SOAP fault message, which contains errors like:
"DUPLICATE_GUID:c9237200-0c69-2a80-dd11-79d5b47b213a"
"FAILED TO INVOKE WEB SERVICE OPERATION"
Right now the only issues are duplicate data and the response time. By using the MSGID they solved the duplicate issue, but because of it we see many messages failing daily with this DUPLICATE error in production, which hampers performance.
So we are thinking of getting rid of the MSGID method and the BPM. From your first answer I think we can achieve this without BPM; after that I need to check how fast the response gets back to the web service.
-Deepthi. -
Avoiding duplicate records in report
Hi All,
I have a scenario where
a delivery document gets created in R/3, say on 7/1, with actual GI date "#" and all key figures "0". This gets loaded into BI.
On 7/5 the goods issue is posted, and the status in R/3 changes to actual GI date 7/5 with a quantity of 100. When this is loaded into BI, it gets published as duplicate records, i.e.:
Del doc   Created date   Act GI   Del. Ind   Qty
12345     1-Jul          #        #          0
12345     1-Jul          5-Jul    #          100
Please note that the data gets loaded from the DSO into the InfoCube and the DSO is in overwrite mode.
Any suggestions to overcome this problem?
Is Act GI date a key field in the DSO?
If yes, the data will not be overwritten and two records will be loaded into the cube.
Make Act GI date a data field, which will result in only one record (12345 1-Jul 5-Jul # 100), as the key-field values are then the same.
First, make sure this is right for all business scenarios. -
BI statistics, Duplicate records in ST03N
Hi Friends,
We have activated BI statistics, and now we are checking query performance in ST03N.
In the Reporting Analysis view I can monitor query access. The problem is that this view shows the statistical data saved in 0TCT_C02 (InfoCube) and 0TCT_VC02 (virtual cube), so the entries the view displays are duplicated.
How can I solve this?
Thanks in advance!
Hi,
Please implement the OSS Note:
1401235: Avoid Duplicate records and handling of Virtual Cube call
-Vikram -
Avoid duplicate standard recipe qty
Dear All,
I have found a problem while building a report. In transaction C203 we can see the product recipe. Generally there is only one recipe group per product, but for some products I have found two recipe groups, e.g. 5....100 and 5....200, which is OK and does happen.
Now I need to fetch the standard quantity of the input materials versus the process-order quantity of the input materials. Currently I fetch two recipe groups, e.g. 0001...820 for the first recipe group and 0001...820 for the second, but I need the quantity from only one recipe group. At the moment the standard quantity appears doubled against the process-order quantity, because the BOM number (STLNR) is the same for both recipe groups.
In COR3, on the master data tab, the particular recipe group (e.g. 5...100) is defined, and this is reflected in table AFKO. But mainly I need the standard quantity of the recipe, so I looked at tables STAS, STKO and STPO. In STPO I can see the standard quantity of the input materials, and in STKO the product number and its batch size. The field STLAL is in STAS and also in STKO, but not in STPO, for linking purposes. Now in STPO I see, for example:
STLNR      IDNRK        Qty
00000639   0001...820   50
00000639   0001...820   50
In my report the standard quantity comes out as 100, but I want 50, because I have not found any link to filter on one BOM number (STLNR).
Are there any other tables I can search, or what should I do?
Regards,
Shivam.
Hi Shivam,
You can use the DELETE ADJACENT DUPLICATES syntax to avoid duplicate records in an internal table:
STLNR      IDNRK        Qty
00000639   0001...820   50
00000639   0001...820   50
sort itab by stlnr idnrk.
delete adjacent duplicates from itab comparing stlnr idnrk.
Regards,
Mohammed Rasul.S -
Can anyone tell me the flow of a BDC program?
Hi
Can anyone tell me the flow of a BDC program, and the significance of each step?
Thanks,
Gagan
Not sure what you are asking here. A BDC program is a recording over an SAP transaction. The flow is determined by the screen sequence of the transaction which is being recorded.
What you do is fill an internal table with screen numbers, fields with values, and fcodes, then call the transaction using that internal table.
For example, the following program calls PA30 and enters a time event.
report zrich_0001
no standard page heading.
parameters: p_pernr type pa0002-pernr,
p_plans type t528b-plans.
data: mode type c value 'N'.
data: bdcdata type table of bdcdata with header line.
data: messtab type table of bdcmsgcoll with header line.
start-of-selection.
perform do_transaction using 'P10'
p_pernr
p_plans.
* FORM do_transaction *
form do_transaction using timev
pernr
plans.
data: bdcdate(10) type c,
bdctime(8) type c.
call function 'CONVERT_DATE_TO_EXTERNAL'
exporting
date_internal = sy-datum
importing
date_external = bdcdate.
write sy-uzeit to bdctime.
clear bdcdata. refresh bdcdata.
clear messtab. refresh messtab.
perform bdc_dynpro using 'SAPMP50A' '1000'.
perform bdc_field using 'BDC_OKCODE'
'=INS'.
perform bdc_field using 'RP50G-PERNR'
pernr.
perform bdc_field using 'RP50G-TIMR6'
'X'.
perform bdc_field using 'BDC_CURSOR'
'RP50G-CHOIC'.
perform bdc_field using 'RP50G-CHOIC'
'2011'.
perform bdc_dynpro using 'MP200000' '2500'.
perform bdc_field using 'BDC_CURSOR'
'T705H-GTEXT'.
perform bdc_field using 'BDC_OKCODE'
'=DIFP'.
perform bdc_field using 'P2011-LDATE'
bdcdate.
perform bdc_field using 'P2011-LTIME'
bdctime.
perform bdc_field using 'P2011-SATZA'
timev.
if timev = 'P10'.
perform bdc_dynpro using 'MP200000' '2221'.
perform bdc_field using 'BDC_CURSOR'
'P2APL-PLANS'.
perform bdc_field using 'BDC_OKCODE'
'=DOIT'.
perform bdc_field using 'P2APL-OTYPE'
'A'.
perform bdc_field using 'P2APL-PLANS'
plans.
perform bdc_field using 'P2APL-WAERS'
'USD'.
endif.
perform bdc_dynpro using 'MP200000' '2500'.
perform bdc_field using 'BDC_CURSOR'
'P2011-LDATE'.
perform bdc_field using 'BDC_OKCODE'
'=UPD'.
call transaction 'PA30' using bdcdata
mode mode
messages into messtab.
* If error occurs, give message and come out.
if sy-subrc <> 0.
endif.
clear bdcdata. refresh bdcdata.
clear messtab. refresh messtab.
endform.
* bdc_dynpro
form bdc_dynpro using program dynpro.
clear bdcdata.
bdcdata-program = program.
bdcdata-dynpro = dynpro.
bdcdata-dynbegin = 'X'.
append bdcdata.
endform.
* bdc_field
form bdc_field using fnam fval.
clear bdcdata.
bdcdata-fnam = fnam.
bdcdata-fval = fval.
append bdcdata.
endform.
Regards,
Rich Heilman -
Need help with an SHDB BDC program for changing an outbound delivery (VL02N).
I have created a recording to change an outbound delivery (VL02N). The steps are as follows:
In the VL02N recording I first click on the header (F8), then on the Dates tab.
Then I insert a line (the + button) and it shows 8 transport types.
I chose the 7th transport type; in SHDB this shows up as BDC_CURSOR = '08/07'.
I then created a BDC program from this recording, but it is not working,
because the BDC_CURSOR value changes every time in SHDB or VL02N, and in my code I have hard-coded BDC_CURSOR = '08/07'.
Can anyone tell me how to determine the changed BDC_CURSOR value, so that instead of hard-coding it I can pick up the value each time?
(FYI: for this, screen name = SAPMSSY0, screen no. = 0120.)
Thanks. -
Recording TCode for BDC program
Hi All,
I have a problem recording transaction GS01.
There is a table control in GS01.
After a certain number of rows I need to continue entering data.
How can I record this and write a BDC program for it?
Hi,
To upload data through a table control when the number of lines exceeds what fits on the screen, use the P+ (page down) okcode in your code.
Assume you can see 10 rows on the screen: once your counter reaches 10, send P+ so that
a new page comes up, and after the P+ clear your counter.
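A minimal sketch of that paging logic, assuming the usual bdc_dynpro/bdc_field helper forms. The program name, screen number and table-control field name below are placeholders; take the real ones from your own SHDB recording of GS01:

```abap
data: lv_count     type i,
      lv_index(2)  type n,
      lv_field(30) type c.

loop at it_values.
  lv_count = lv_count + 1.
  lv_index = lv_count.
  " build the row field name, e.g. RGSB4-SETVAL(01) .. RGSB4-SETVAL(10)
  concatenate 'RGSB4-SETVAL(' lv_index ')' into lv_field.
  perform bdc_field using lv_field it_values-setval.
  if lv_count = 10.                               " visible rows filled
    perform bdc_field  using 'BDC_OKCODE' 'P+'.   " page down
    perform bdc_dynpro using 'SAPMGSBM' '0105'.   " re-enter the same screen
    clear lv_count.
  endif.
endloop.
```

After each P+ the row index restarts at 01, because the table control shows a fresh page.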