Duplicate value in export file
Hi,
I have a SQL report where I have implemented download to Excel using the print attribute. When I export the report to Excel, there are a few columns whose values get duplicated in the spreadsheet, whereas those column values are blank in the report.
For example:
Query:
select col1, col2, col3, col4 from table1;
The report displays the correct values for col1, col2, col3 and col4, but in the export file the values for col1, col2 and col3 are correct while col4 picks up the value of col3.
Any help? Thanks.
--Manish
Hi Scott,
Could you please have a look at it.
http://apex.oracle.com/pls/otn/f?p=47427:12
When you click download_excel at the bottom of the page, the export file shows duplicate values for the columns [G-L].
Thanks again for your help.
--Manish
Message was edited by:
Manish Jha
Similar Messages
-
Unable to Enforce Unique Values, Duplicate Values Exist
I have a list in SP 2010 containing roughly 1000 items. I would like to enforce unique values on the Title field. I started by cleaning up the list, ensuring that all items already had a unique value. To help with this, I used the Export to Excel action, then Highlight Duplicates within Excel. So as far as I can tell, there are no duplicates within that list column.
However, when I try to enable the option to Enforce Unique Values, I receive the error that duplicate values exist within the field and must be removed.
Steps I've taken so far to identify / resolve duplicate values:
- Multiple exports to Excel from an unfiltered list view, then using highlight duplicates feature > no duplicates found
- deleted ALL versions of every item from the list (except current), ensured they were completely removed by deleting from both site and site collection recycle bins
- Using the SP Powershell console, grabbed all list items and exported all of the "Title" type fields (Item object Title, LinkTitle, LinkTitleNoMenu, etc) to a csv and ran that through excel duplicate checking as well.
Unless there's some ridiculous hidden field value that MS expects anyone attempting to enforce unique values on a list to know about, I've exhausted everything I can think of that might cause the list to report duplicate values for that field.
While I wait to see if someone else has an idea, I'm also going to see what happens if I wipe the Crawl Index and start it from scratch.
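One more check worth trying on the exported CSV: compare titles after trimming and collapsing whitespace and folding case. Values that differ only in casing or stray spaces can collide at the database level even when cell-by-cell duplicate checks report nothing. A minimal sketch (the column name and sample rows are illustrative):

```python
from collections import defaultdict

def find_hidden_duplicates(rows, column="Title"):
    """Group values that collide once whitespace is collapsed and case is folded."""
    groups = defaultdict(list)
    for row in rows:
        # trim, collapse internal whitespace, ignore case
        key = " ".join(row[column].split()).casefold()
        groups[key].append(row[column])
    return {k: v for k, v in groups.items() if len(v) > 1}

# These rows would pass an exact-match duplicate check but still collide
# under a case/whitespace-insensitive comparison.
rows = [{"Title": "Server List"}, {"Title": "server  list "}, {"Title": "Intranet"}]
print(find_hidden_duplicates(rows))
```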
--Jon

First, I created an index for the column in list settings; that works fine whether or not duplicate values exist. Then I set Enforce Unique Values on the field; after clicking OK, I get the duplicate values error message.
With SQL Server Profiler, I found the call to proc_CheckIfExistingFieldHasDuplicateValues and its parameters. After reviewing this stored procedure in the content database,
I created the following script in SQL Server Management Studio:
declare @siteid uniqueidentifier
declare @webid uniqueidentifier
declare @listid uniqueidentifier
declare @fieldid uniqueidentifier

set @siteid  = 'F7C40DC9-E5D3-42D7-BE60-09B94FD67BEF'
set @webid   = '17F02240-CE04-4487-B961-0482B30DDA84'
set @listid  = 'B349AF8D-7238-419D-B6C4-D88194A57EA7'
set @fieldid = '195A78AC-FC52-4212-A72B-D03144DC1E24'

SELECT *
FROM TVF_UserData_List(@listid) AS U1
INNER MERGE JOIN NameValuePair_Latin1_General_CI_AS AS NVP1
    WITH (INDEX = NameValuePair_Latin1_General_CI_AS_MatchUserData)
    ON  NVP1.ListId = @listid
    AND NVP1.ItemId = U1.tp_Id
    AND ((NVP1.Level = 1 AND U1.tp_DraftOwnerId IS NULL) OR NVP1.Level = 2)
    AND NOT (DATALENGTH(ISNULL(NVP1.Value, '')) = 0)
    AND U1.tp_Level = NVP1.Level
    AND U1.tp_IsCurrentVersion = CONVERT(bit, 1)
    AND U1.tp_CalculatedVersion = 0
    AND U1.tp_RowOrdinal = 0
INNER MERGE JOIN NameValuePair_Latin1_General_CI_AS AS NVP2
    WITH (INDEX = NameValuePair_Latin1_General_CI_AS_CI)
    ON  NVP2.SiteId  = @siteid
    AND NVP2.ListId  = @listid
    AND NVP2.FieldId = @fieldid
    AND NVP2.Value   = NVP1.Value
    AND NVP2.ItemId <> NVP1.ItemId
CROSS APPLY TVF_UserData_ListItemLevelRow(NVP2.ListId, NVP2.ItemId, NVP2.Level, 0) AS U2
WHERE ((NVP2.Level = 1 AND U2.tp_DraftOwnerId IS NULL) OR NVP2.Level = 2)
  AND NOT (DATALENGTH(ISNULL(NVP2.Value, '')) = 0)
I can find the duplicate list items based on the result returned by the query above.
Note that you need to change the parameter values accordingly, and change the name of the NameValuePair_Latin1_General_CI_AS table based on the last parameter of the
proc_CheckIfExistingFieldHasDuplicateValues stored procedure. You can review the code of this stored procedure yourself.
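For readers who cannot query the content database directly, the essence of that self-join can be modelled outside SQL: pair up items whose field values match under a case-insensitive comparison but whose item ids differ (the NVP2.ItemId &lt;&gt; NVP1.ItemId condition). The item ids and values below are made up for illustration:

```python
# A compact model of the stored procedure's duplicate check.
items = {1: "Server List", 2: "SERVER LIST", 3: "Intranet"}

duplicate_pairs = sorted(
    (a, b)
    for a in items
    for b in items
    # compare each unordered pair once, case-insensitively
    if a < b and items[a].casefold() == items[b].casefold()
)
print(duplicate_pairs)
```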
Note that direct operation on the content database in production environment is not supported, please do all these in test environment. -
Recovery from bad export file - IMP-00009 error
Hi all,
I have a large export that is split into 2GB chunks using the FILESIZE option at export. The command string is something like this:
exp userid=xxx/xxx filesize=2048M file=bu${DATE}_01.dmp,bu${DATE}_02.dmp,bu${DATE}_03.dmp full=y log=bu${DATE}.log
The export log showed no errors, but on import I'm getting IMP-00009: abnormal end of export file.
Typically there's a rather large and important table that's split between the first and second files, which I really need to get at. If I import at the moment, it rolls back the first 1.1 million odd rows before it fails, and it won't load the second file at all; it says something about the sequence of the file being incorrect.
OS is AIX 4.3.3
Oracle is 8.1.7.0.0
Any ideas how I can recover the table that is split across files? I've tried catting the files together to form one large one (and importing on an OS without file system size limits), but no dice. Perhaps I can alter the file to fool imp into thinking it's a single file?
Thanks in desparation!
--Adam

Username:
Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Export file created by EXPORT:V08.01.07 via conventional path
import done in WE8MSWIN1252 character set and AL16UTF16 NCHAR character set
export client uses US7ASCII character set (possible charset conversion)
export server uses US7ASCII NCHAR character set (possible ncharset conversion)
IMP-00046: using FILESIZE value from export file of 2147483648
. importing C45966's objects into C45966
"CREATE TABLE "CUST_LEDGER_BALANCE" ("HOUSE" CHAR(6) NOT NULL ENABLE, "CUST""
" CHAR(2) NOT NULL ENABLE, "SERVICE_GROUP" NUMBER(2, 0) NOT NULL ENABLE, "LE"
"DGER_SEQ" NUMBER(10, 0) NOT NULL ENABLE, "LDGRDATE" DATE NOT NULL ENABLE, ""
"STMT_DATE" DATE NOT NULL ENABLE, "LDTYPE" NUMBER(2, 0) NOT NULL ENABLE, "AM"
"OUNT" NUMBER(12, 0) NOT NULL ENABLE, "DECIMAL_CNT" NUMBER(1, 0) NOT NULL EN"
"ABLE, "SUMCODE" CHAR(3) NOT NULL ENABLE, "LEDGER_STATUS" NUMBER(1, 0) NOT N"
"ULL ENABLE, "PROG" NUMBER(3, 0) NOT NULL ENABLE, "PRINT_FLAG" NUMBER(1, 0) "
"NOT NULL ENABLE, "CYCLE_DATE" DATE NOT NULL ENABLE, "DUE_DATE" DATE NOT NUL"
"L ENABLE, "MULTIMONTH_CNT" NUMBER(2, 0), "MULTIMONTH_AMT" NUMBER(12, 0)) P"
"CTFREE 10 PCTUSED 40 INITRANS 1 MAXTRANS 255 LOGGING STORAGE(INITIAL 239796"
"224 NEXT 4194304 MINEXTENTS 1 MAXEXTENTS 240 PCTINCREASE 0 FREELISTS 1 FREE"
"LIST GROUPS 1 BUFFER_POOL DEFAULT) TABLESPACE "TBL_CUST_LED_C45966""
. . skipping table "CUST_LEDGER_BALANCE"
IMP-00009: abnormal end of export file
Import terminated successfully with warnings.
..they were transferred in binary mode. Same result if I specify 2048M filesize in the import string. It only seems to affect the table that is broken into two files; the others seem OK.
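For reference, the "catting" attempt described earlier amounts to a binary-safe concatenation like the sketch below (the file names are placeholders). As far as I know, classic exp writes header information per FILESIZE piece, so imp generally cannot read a dump rebuilt this way, which is consistent with the IMP-00009 above:

```python
import shutil

def concat_binary(parts, target):
    """Byte-for-byte concatenation of dump pieces, with no text-mode translation."""
    with open(target, "wb") as out:
        for part in parts:
            with open(part, "rb") as src:
                shutil.copyfileobj(src, out)  # copies raw bytes in chunks
```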
Thanks!
Adam -
J1INQEFILE - efile generation - Exported file shows Duplicate records.
Dear Team,
When I execute J1INQEFILE, I am facing a problem with the e-file generation, i.e. the exported Excel file. When I execute and export the file to the desktop, I can see duplicate records.
For example, on execution of J1INQEFILE I can see 4 records on the SAP screen, whereas the exported file on the desktop shows me 2 more identical records, i.e. 6 records. As a result, in the SAP system I see a Base Amount of 40000, i.e. 10000 for each; the Excel sheet, on the contrary, shows me 60000, i.e. 6 records of 10000 each (because of the 2 duplicated records), and also shows the TDS amount wrong. How are the records getting duplicated? Is there any SAP note to fix this? We are debugging this but have no clue so far....
Please assist on this issue....
Thanks in advance!!!!

Dear Sagar,
I am an abaper,
I came across the same kind of situation for one of our clients: when we executed J1INQEFILE and exported to an Excel file, we would get duplicate records.
For this I debugged the program and checked the point of e-file generation; duplicate records were being appended to the internal table that is downloaded to Excel. So I pulled the document number into the internal table and used DELETE ADJACENT DUPLICATES comparing all fields, and was thereby able to resolve the issue.
Hope the same logic helps or guide you to proceed with the help of an abaper.
<<Text removed>>
Regards,
Kalyan
Edited by: Matt on Sep 8, 2011 9:14 PM -
Read two CSV files and remove the duplicate values within them.
Hi,
I want to read two CSV files (each containing more than 100 rows and 100 columns), remove the duplicate values within those two files, and merge all the unique values into a single file.
Can anyone help me out.
Thanks in advance.

kirthi wrote:
Can you help me....
Yeah, I've just finished... Here's a skeleton of my solution.
The first thing I think you should do is write a line-parser which splits your input data up into fields, and test it.
Then fill out the below parse method, and test it with that debugPrint method.
Then go to work on the print method.
I can help a bit along the way, but if you want to do this then you have to do it yourself. I'm not going to do it for you.
Cheers. Keith.
package forums.kirthi;
import java.util.*;
import java.io.PrintStream;
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import krc.utilz.io.ParseException;
import krc.utilz.io.Filez.LineParser;
import krc.utilz.io.Filez.CsvLineParser;
public class DistinctColumnValuesFromCsvFiles
{
  public static void main(String[] args) {
    if (args.length==0) args = new String[] {"input1.csv", "input2.csv"};
    try {
      // data is a Map of ColumnNames to Sets-Of-Values
      Map<String,Set<String>> data = new HashMap<String,Set<String>>();
      // add the contents of each file to the data
      for ( String filename : args ) {
        data.putAll(parse(filename));
      }
      // print the data to output.csv
      print(data);
    } catch (Exception e) {
      e.printStackTrace();
    }
  }

  private static Map<String,Set<String>> parse(String filename) throws IOException, ParseException {
    BufferedReader reader = null;
    try {
      reader = new BufferedReader(new FileReader(filename));
      CsvLineParser.squeeze = true; // field.trim().replaceAll("\\s+"," ")
      LineParser<String[]> parser = new CsvLineParser();
      int lineNumber = 1;
      // 1. read the column names (first line of file) into a List
      // 2. read the column values (subsequent lines of file) into a List of Set's of String's
      // 3. build a Map of columnName --> columnValues and return it
      return new HashMap<String,Set<String>>(); // placeholder until steps 1-3 are filled in
    } finally {
      if (reader != null) reader.close();
    }
  }

  private static void debugPrint(Map<String,Set<String>> data) {
    for ( Map.Entry<String,Set<String>> entry : data.entrySet() ) {
      System.out.println("DEBUG: "+entry.getKey()+" "+Arrays.toString(entry.getValue().toArray(new String[0])));
    }
  }

  private static void print(Map<String,Set<String>> data) {
    // 1. get the column names from the table.
    // 2. create a List of List's of String's called matrix; logically [COL][ROW]
    // 3. print the column names and add the List<String> for this col to the matrix
    // 4. print the matrix by iterating columns and then rows
  }
}
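For comparison, the same idea can be sketched compactly in Python: read each CSV, collect the distinct values per column across both files, and return the merged result. The file names and the header-row-first layout are assumptions:

```python
import csv

def distinct_columns(*filenames):
    """Collect the distinct values of each column across all the given CSV files."""
    data = {}  # column name -> insertion-ordered dict used as an ordered set
    for name in filenames:
        with open(name, newline="") as f:
            for row in csv.DictReader(f):
                for col, val in row.items():
                    data.setdefault(col, {})[val] = None  # dict keys drop duplicates
    return {col: list(vals) for col, vals in data.items()}
```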
1.2.0 reads TNSNAMES.ORA file multiple times and duplicates values in Alias
Since upgrading to 1.2.0.29.98 (clean install on Win XP SP 2), I have noticed that SQL Developer appears to be reading the TNSNAMES.ORA file multiple times. This results in the Network Alias pop-list on the TNS connection type to have duplicate values, although the number of duplicates for each entry varies widely - from four for the least that I saw and 252 for the most (assuming I counted that right :) ). We have approximately 270 entries in our TNSNAMES.ORA and the first entry in the file appears in the Network Alias list four times and the last entry in the file appears in the list 48 times.
To be honest, I switched to Basic JDBC connections with 1.1 and only noticed the problem because of some network performance issues - I thought that SQL Developer had hung, and when I switched on the debugging I could see that it was looping through the TNSNAMES.ORA file. Now that the network performance issues have been resolved, it still takes a little while to open the new connection window, but it is liveable, so I don't know whether the problem is 1.2 specific or not.

Sue,
I assume from a bit of testing, that "each tns file on the system" means each file in the TNS_ADMIN location that starts with TNSNAMES.ORA.
I set my TNS_ADMIN to a local location and copied the current TNSNAMES.ORA from the network location (and chopped it down to a handful of entries). When I restarted SQL Developer I only had a single copy of each alias. If I copied TNSNAMES.ORA to "Copy of TNSNAMES.ORA" I still only had a single copy of each alias. If I copied TNSNAMES.ORA to TNSNAMES.ORA.TXT I then had two copies of each alias.
Unfortunately, I don't have any say in the maintenance of the network TNS_ADMIN location and it has almost 200 backup copies of the tns file, typically named TNSNAMES.ORA.YYYYMMDD.
My TNS_ADMIN setting is done as a Windows environment variable. I do not have any TNS_ADMIN setting in my registry. -
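The prefix matching observed above can be sketched as follows; the number of matching files in the TNS_ADMIN directory roughly corresponds to how many copies of each alias appear in the pop-list. Note that "Copy of TNSNAMES.ORA" does not match, which is consistent with it not adding a duplicate:

```python
def matching_tns_files(filenames):
    """Files whose name starts with TNSNAMES.ORA, as the alias loader appears to match them."""
    return [f for f in filenames if f.upper().startswith("TNSNAMES.ORA")]

# Sample directory listing: the dated backup matches, the "Copy of" file does not.
files = ["tnsnames.ora", "TNSNAMES.ORA.20070101", "Copy of TNSNAMES.ORA", "sqlnet.ora"]
print(matching_tns_files(files))  # each match contributes one copy of every alias
```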
NDS error: duplicate value (-614) on Generic LDAP Export to NetIQ eDirectory
Dear community,
using the Generic LDAP agent, the latest eDirectory (8.8 SP8 (20806.01)) and FIM version (4.1.3627.0), I encounter the following problem in very special situations, namely when the value in eDirectory differs from the FIM value only by upper/lower case letters:
NDS error: duplicate value (-614)
DirectoryOperationException: (0) 0 Server Message: The attribute exists or the value has been assigned.

I don't see that as a problem when it is in fact doing a string comparison. You may need to write an advanced flow rule to simply say something like this
C# snippet (write only if not equal, ignoring case):
if (!csentry["co"].Value.ToUpper().Equals(mventry["co"].Value.ToUpper()))
{
    csentry["co"].Value = mventry["co"].Value;
}
Nosh Mernacaj, Identity Management Specialist -
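The intent of the guard in that snippet, modelled in Python: the write is skipped whenever the two values differ only by case, so a case-only difference never triggers the duplicate-value error:

```python
def needs_update(current, new):
    """Write only when the values differ under a case-insensitive comparison."""
    return current.upper() != new.upper()

print(needs_update("Germany", "France"))   # differs: the flow rule would write
print(needs_update("germany", "GERMANY"))  # case-only difference: skip the write
```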
Duplicate value in Command extract report
Hi Experts
I am getting duplicate records when extracting the BOM (the material extract, as I call it).
When running the report I get duplicate values for the cost estimate, in the FORM extract_bom_details, under the select queries.
Can anyone see from the code below why I am getting this?
TYPES:
BEGIN OF ty_bom,
matnr(12) TYPE c, " Material #
werks(4) TYPE c, " Plant
stprs(10) TYPE c, " Quantity
meins(3) TYPE c, " Unit of Measure
END OF ty_bom,
it_ty_bom TYPE ty_bom OCCURS 0,
*Start of block of changes for JPC20061107
BEGIN OF ty_bom_sapfmt,
matnr TYPE MARC-MATNR, " Material #
werks TYPE MARC-WERKS, " Plant
stprs TYPE MBEW-STPRS, " Quantity
meins TYPE MARA-MEINS, " Unit of Measure
END OF ty_bom_sapfmt.
Report COMMAND_EXTRACT_D *
REPORT command_extract_d .
*CLASS cl_gui_control DEFINITION LOAD.
*CLASS cl_gui_frontend_services DEFINITION LOAD.
Tables
TABLES:
kna1,
knb1,
knvv,
mara,
mast, "JPC20061107
marc,
makt,mbew,
sscrfields.
CONSTANTS: BEGIN OF gc_status,
acc TYPE zcrstat1 VALUE ' ACC',
hol TYPE zcrstat1 VALUE ' HOL',
sto TYPE zcrstat1 VALUE ' STO',
ok TYPE zcrstat1 VALUE ' OK',
hold TYPE zcrstat1 VALUE 'HOLD',
END OF gc_status.
CONSTANTS: BEGIN OF gc_reason,
000 TYPE zreason VALUE '000',
001 TYPE zreason VALUE '001',
002 TYPE zreason VALUE '002',
003 TYPE zreason VALUE '003',
004 TYPE zreason VALUE '004',
005 TYPE zreason VALUE '005',
006 TYPE zreason VALUE '006',
007 TYPE zreason VALUE '007',
008 TYPE zreason VALUE '008',
010 TYPE zreason VALUE '010',
011 TYPE zreason VALUE '011',
021 TYPE zreason VALUE '021',
022 TYPE zreason VALUE '022',
023 TYPE zreason VALUE '023',
024 TYPE zreason VALUE '024',
025 TYPE zreason VALUE '025',
026 TYPE zreason VALUE '026',
999 TYPE zreason VALUE '999',
END OF gc_reason.
TYPES: BEGIN OF ty_kna1_fields,
kunnr TYPE kna1-kunnr,
sperr TYPE kna1-sperr,
aufsd TYPE kna1-aufsd,
lifsd TYPE kna1-lifsd,
faksd TYPE kna1-faksd,
loevm TYPE kna1-loevm,
END OF ty_kna1_fields.
TYPES: BEGIN OF ty_knb1_fields,
kunnr TYPE knb1-kunnr,
bukrs TYPE knb1-bukrs,
sperr TYPE knb1-sperr,
loevm TYPE knb1-loevm,
END OF ty_knb1_fields.
TYPES: BEGIN OF ty_knvv_fields,
kunnr TYPE knvv-kunnr,
vkorg TYPE knvv-vkorg,
vtweg TYPE knvv-vtweg,
spart TYPE knvv-spart,
aufsd TYPE knvv-aufsd,
lifsd TYPE knvv-lifsd,
faksd TYPE knvv-faksd,
END OF ty_knvv_fields.
TYPES: BEGIN OF ty_knkk_fields,
kunnr TYPE knkk-kunnr,
kkber TYPE knkk-kkber,
ctlpc TYPE knkk-ctlpc,
crblb TYPE knkk-crblb,
knkli TYPE knkk-knkli,
klimk TYPE knkk-klimk,
skfor TYPE knkk-skfor,
ssobl TYPE knkk-ssobl,
END OF ty_knkk_fields.
TYPES: BEGIN OF ty_cust_stat_output,
kunnr(10) TYPE c, "Customer #
stat(4) TYPE c, "Customer status
END OF ty_cust_stat_output.
TYPES:
BEGIN OF ty_customers,
kunnr(10) TYPE c, " Customer #
div1(1) TYPE c, " Pipe delimiter
name1(32) TYPE c, " Customer name
div2(1) TYPE c, " Pipe delimiter
altkn(8) TYPE c, " Old Customer #
div3(1) TYPE c, " Pipe delimiter
stras(30) TYPE c, " Street
div4(1) TYPE c, " Pipe delimiter
ort01(20) TYPE c, " City
div5(1) TYPE c, " Pipe delimiter
regio(3) TYPE c, " State
div6(1) TYPE c, " Pipe delimiter
pstlz(4) TYPE c, " Postcode
div7(1) TYPE c, " Pipe delimiter
telf1(14) TYPE c, " Phone 1
div8(1) TYPE c, " Pipe delimiter
telf2(14) TYPE c, " Phone 2
div9(1) TYPE c, " Pipe delimiter
erdat(10) TYPE c, " date
div10(1) TYPE c, " Pipe delimiter
splant(2) TYPE c, " plant
div11(1) TYPE c, " Pipe delimiter
END OF ty_customers,
it_ty_customers TYPE ty_customers OCCURS 0,
BEGIN OF ty_custstat,
kunnr(11) TYPE c, " Customer #
company(2) TYPE c, " #
status(3) TYPE c, " acc,cod,hol,
END OF ty_custstat,
it_ty_custstat TYPE ty_custstat OCCURS 0.
TYPES:
BEGIN OF ty_materialm,
matnr(12) TYPE c, " Material #
maktx2(40) TYPE c, " Command sales desc.
maktx(16) TYPE c, " Basic description
extwg(6) TYPE c, " External Material Group
flag1(1) TYPE c, "
flag2(1) TYPE c, "
flag3(1) TYPE c, "
flag4(1) TYPE c, "
END OF ty_materialm,
it_ty_materialm TYPE ty_materialm OCCURS 0,
BEGIN OF ty_materialp,
matnr(12) TYPE c, " Material #
werks(4) TYPE c, " plant
batch(1) TYPE c, " download to batch
END OF ty_materialp,
it_ty_materialp TYPE ty_materialp OCCURS 0.
TYPES:
BEGIN OF ty_bom,
matnr(12) TYPE c, " Material #
werks(4) TYPE c, " Plant
stprs(10) TYPE c, " Quantity
meins(3) TYPE c, " Unit of Measure
END OF ty_bom,
it_ty_bom TYPE ty_bom OCCURS 0,
*Start of block of changes for JPC20061107
BEGIN OF ty_bom_sapfmt,
matnr TYPE MARC-MATNR, " Material #
werks TYPE MARC-WERKS, " Plant
stprs TYPE MBEW-STPRS, " Quantity
meins TYPE MARA-MEINS, " Unit of Measure
END OF ty_bom_sapfmt.
Selection screen definition
SELECTION-SCREEN: BEGIN OF BLOCK bom WITH FRAME TITLE text-004.
PARAMETERS:
p_dbom LIKE filepath-pathintern DEFAULT 'Z_COMMAND_BOM_EXTRACT',
p_fbom LIKE rlgrap-filename.
SELECT-OPTIONS:
s_bmatnr FOR MARA-MATNR, "JPC20061107
s_bwerks FOR MAST-WERKS no-extension no intervals, "JPC20061107
s_bextwg FOR mara-extwg.
SELECTION-SCREEN: END OF BLOCK bom.
SELECT-OPTIONS:
s_kunnr FOR kna1-kunnr.
SELECTION-SCREEN: END OF BLOCK ccst.
DATA: clsdir TYPE REF TO cl_gui_frontend_services.
DATA: strfolder TYPE string.
DATA: folderln TYPE i.
DATA: gva_error(1) TYPE c VALUE ' '.
Initial procedure on START ***************
INITIALIZATION.
p_fcust = 'CUST.prn'.
p_fmatm = 'MATM.prn'.
p_fmatp = 'MATP.prn'.
p_fbom = 'BOM.prn'.
p_fccst = 'CCSTAT.prn'.
p_fccst2 = 'CCSTATC.prn'.
p_fccstl = 'CCSTATL.prn'.
s_dextwg-option = 'BT'.
s_dextwg-low = '1'.
s_dextwg-high = '8'.
APPEND s_dextwg.
s_bextwg-option = 'BT'.
s_bextwg-low = '1'.
s_bextwg-high = '1'.
APPEND s_bextwg.
AT SELECTION-SCREEN.
IF p_ccust EQ 'X' AND ( p_fcust IS INITIAL OR p_dcust IS INITIAL ).
MESSAGE s000(zppu)
WITH 'You must specify the file details for the customer data'.
gva_error = 'X'.
ENDIF.
IF p_cmatm EQ 'X' AND ( p_fmatm IS INITIAL OR
p_fmatp IS INITIAL OR
p_dmatm IS INITIAL ).
MESSAGE s001(zppu)
WITH 'You must specify the file details for the material data'.
gva_error = 'X'.
ENDIF.
IF p_cbom EQ 'X' AND ( p_fbom IS INITIAL OR p_dbom IS INITIAL ).
MESSAGE s002(zppu)
WITH 'You must specify the file details for the BOM data'.
gva_error = 'X'.
ENDIF.
IF p_cbom EQ 'X'.
IF s_bwerks-low is initial.
MESSAGE s002(zppu)
WITH 'You must specify a plant to run BOM extract for'.
gva_error = 'X'.
ENDIF.
IF LINES( s_bwerks ) > 1.
MESSAGE s002(zppu)
WITH 'You can only specify 1 plant for BOM extract'.
gva_error = 'X'.
ENDIF.
ENDIF.
IF p_ccst EQ 'X' AND ( p_fccst IS INITIAL OR
p_fccst2 IS INITIAL OR
p_fccstl IS INITIAL OR
p_dccst IS INITIAL ).
MESSAGE s002(zppu)
WITH 'You must specify the file details for the customer'
'status data'.
gva_error = 'X'.
ENDIF.
AT SELECTION-SCREEN OUTPUT.
PERFORM user_command.
AT SELECTION-SCREEN ON VALUE-REQUEST FOR p_fcust.
PERFORM get_gui_filename USING p_dcust p_fcust.
AT SELECTION-SCREEN ON VALUE-REQUEST FOR p_fmatm.
PERFORM get_gui_filename USING p_dmatm p_fmatm.
AT SELECTION-SCREEN ON VALUE-REQUEST FOR p_fmatp.
PERFORM get_gui_filename USING p_dmatm p_fmatp.
AT SELECTION-SCREEN ON VALUE-REQUEST FOR p_fbom.
PERFORM get_gui_filename USING p_dbom p_fbom.
AT SELECTION-SCREEN ON VALUE-REQUEST FOR p_fccst.
PERFORM get_gui_filename USING p_dccst p_fccst.
AT SELECTION-SCREEN ON VALUE-REQUEST FOR p_fccst2.
PERFORM get_gui_filename USING p_dccst p_fccst2.
AT SELECTION-SCREEN ON VALUE-REQUEST FOR p_fccstl.
PERFORM get_gui_filename USING p_dccst p_fccstl.
END-OF-SELECTION.
Data selection execution.
DATA: lwa_kna1 TYPE kna1,
lwa_knb1 TYPE knb1,
lwa_knvv TYPE knvv.
DATA: lwa_marc TYPE marc,
lwa_mara TYPE marav,
lwa_makt TYPE makt.
DATA: lit_marav TYPE marav OCCURS 0.
DATA: lit_customers TYPE it_ty_customers,
lwa_customers TYPE ty_customers.
DATA: lit_custstat TYPE it_ty_custstat,
lwa_custstat TYPE ty_custstat.
DATA: lit_materialm TYPE it_ty_materialm,
lwa_materialm TYPE ty_materialm.
DATA: lit_materialp TYPE it_ty_materialp,
lwa_materialp TYPE ty_materialp.
*(del)DATA: lwa_bomlist TYPE ty_bomlist, "JPC20061107
*(del) lit_bomlist TYPE it_ty_bomlist. "JPC20061107
DATA: lwa_bom TYPE ty_bom_sapfmt,
lit_bom TYPE it_ty_bom WITH HEADER LINE. "JPC20061107
*(del) lit_bom TYPE it_ty_bom. "JPC20061107
DATA: output_file TYPE string.
DATA: lock_file TYPE string.
DATA: lva_mssage TYPE string.
DATA: lva_matnr(18) TYPE n.
DATA: lva_date TYPE datum.
DATA: txtper(3) TYPE c.
DATA: custper TYPE i.
DATA: custcount TYPE i.
DATA: itemnum TYPE i.
CHECK gva_error <> 'X'.
IF p_ccust EQ 'X'. " Do the customer file extract
PERFORM extract_customer_details.
ENDIF.
IF p_cmatm EQ 'X'. " Do the Material file extract
PERFORM extract_material_details.
ENDIF.
IF p_cbom EQ 'X'. " Do the BOM file extract
PERFORM extract_bom_details.
ENDIF.
IF p_ccst EQ 'X'. " Do the customer credit status file extract
PERFORM extract_credit_status_details.
ENDIF.
*& Form extract_customer_details
text
--> p1 text
<-- p2 text
FORM extract_customer_details.
SELECT * FROM knb1
INTO lwa_knb1
WHERE bukrs = p_dcomp.
MOVE-CORRESPONDING lwa_knb1 TO lwa_customers.
MOVE-CORRESPONDING lwa_knb1 TO lwa_custstat.
MOVE '#' TO lwa_custstat-company.
MOVE p_dwerk TO lwa_customers-splant.
--WG1K903075: Start Delete-
move: '|' to lwa_customers-div1,
'|' to lwa_customers-div2,
'|' to lwa_customers-div3,
'|' to lwa_customers-div4,
'|' to lwa_customers-div5,
'|' to lwa_customers-div6,
'|' to lwa_customers-div7,
'|' to lwa_customers-div8,
'|' to lwa_customers-div9,
'|' to lwa_customers-div10,
'|' to lwa_customers-div11.
--WG1K903075: End Delete---
--WG1K903075: Start Insert-
MOVE: ' ' TO lwa_customers-div1,
' ' TO lwa_customers-div2,
' ' TO lwa_customers-div3,
' ' TO lwa_customers-div4,
' ' TO lwa_customers-div5,
' ' TO lwa_customers-div6,
' ' TO lwa_customers-div7,
' ' TO lwa_customers-div8,
' ' TO lwa_customers-div9,
' ' TO lwa_customers-div10,
' ' TO lwa_customers-div11.
--WG1K903075: End Insert---
APPEND lwa_customers TO lit_customers.
APPEND lwa_custstat TO lit_custstat.
ENDSELECT.
DESCRIBE TABLE lit_customers LINES custcount.
LOOP AT lit_customers INTO lwa_customers.
txtper = 100 * sy-tabix / custcount.
custper = txtper.
SELECT SINGLE * FROM kna1
INTO lwa_kna1
WHERE kunnr = lwa_customers-kunnr.
MOVE-CORRESPONDING lwa_kna1 TO lwa_customers.
lva_date = lwa_customers-erdat.
CALL FUNCTION 'CONVERT_DATE_TO_EXTERNAL'
EXPORTING
date_internal = lva_date
IMPORTING
date_external = lwa_customers-erdat
EXCEPTIONS
date_internal_is_invalid = 1
OTHERS = 2.
REPLACE ALL OCCURRENCES OF '.'
IN lwa_customers-erdat WITH '/'.
CALL FUNCTION 'CONVERSION_EXIT_ALPHA_OUTPUT'
EXPORTING
input = lwa_customers-kunnr
IMPORTING
output = lwa_customers-kunnr.
WRITE: lwa_customers-kunnr RIGHT-JUSTIFIED TO lwa_customers-kunnr.
MODIFY lit_customers FROM lwa_customers.
Percentage indicator display
lva_mssage = 'Extracting customers'.
CALL FUNCTION 'SAPGUI_PROGRESS_INDICATOR'
EXPORTING
percentage = custper
text = lva_mssage.
ENDLOOP.
IF p_gui IS INITIAL.
PERFORM get_file_path USING p_dcust p_fcust output_file.
OPEN DATASET output_file FOR OUTPUT IN TEXT MODE ENCODING DEFAULT.
IF sy-subrc EQ 0.
LOOP AT lit_customers INTO lwa_customers.
TRANSFER lwa_customers TO output_file.
ENDLOOP.
CLOSE DATASET output_file.
ELSE.
MESSAGE s002(zppu)
WITH 'Dataset' output_file 'cannot be opened on the server'.
ENDIF.
ELSE.
MOVE p_fcust TO output_file.
CALL FUNCTION 'GUI_DOWNLOAD'
EXPORTING
filename = output_file
filetype = 'ASC'
TABLES
data_tab = lit_customers
EXCEPTIONS
OTHERS = 11.
ENDIF.
output_file = custcount.
CONCATENATE
output_file
' Customers extracted'
INTO
lva_mssage.
message lva_mssage type 'I'.
ENDFORM. " extract_customer_details
*& Form extract_bom_details
text
--> p1 text
<-- p2 text
FORM extract_bom_details.
select MARA~MATNR MBEW~BWKEY MBEW~STPRS MARA~MEINS "JPC20061220
into lwa_bom
from ( MARAV AS MARA
inner join MARC
ON MARA~MATNR = MARC~MATNR
inner join MBEW
ON MARC~MATNR = MBEW~MATNR
AND MARC~WERKS = MBEW~BWKEY )
where MARA~MATNR in s_bmatnr
and MBEW~BWKEY in s_bwerks
and MARA~EXTWG in s_bextwg.
clear lit_bom.
WRITE lwa_bom-matnr TO lit_bom-matnr.
MOVE lwa_bom-werks TO lit_bom-werks.
MOVE: lwa_bom-stprs TO lit_bom-stprs,
lwa_bom-meins TO lit_bom-meins.
APPEND lit_bom.
ENDSELECT.
SORT lit_bom ascending.
Ending for lines inserted for change 20061107
IF p_gui IS INITIAL. "write to server
PERFORM get_file_path USING p_dbom p_fbom output_file.
OPEN DATASET output_file FOR OUTPUT IN TEXT MODE ENCODING DEFAULT.
IF sy-subrc EQ 0.
custcount = lines( lit_bom ).
LOOP AT lit_bom.
txtper = 100 * sy-tabix / custcount.
custper = txtper.
lva_mssage = 'Extracting material B.O.Ms'.
CALL FUNCTION 'SAPGUI_PROGRESS_INDICATOR'
EXPORTING percentage = custper
text = lva_mssage.
TRANSFER lit_bom TO output_file.
ENDLOOP.
CLOSE DATASET output_file.
ELSE.
MESSAGE s002(zppu)
WITH 'Dataset' output_file 'cannot be opened on the server'.
ENDIF.
ELSE.
MOVE p_fbom TO output_file.
CALL FUNCTION 'GUI_DOWNLOAD'
EXPORTING
filename = output_file
filetype = 'ASC'
write_field_separator = ' '
TABLES
data_tab = lit_bom
EXCEPTIONS
OTHERS = 11.
ENDIF.
ENDFORM. " extract_bom_details
output with duplicate values
MARA BWKEY STPRS MEINS
10004989 BN01 28.00 TO
10004989 BN01 28.00 TO
10005010 BN01 19.00 EA
10005010 BN01 19.00 EA
10005018 BN01 800.00 BAG
10005018 BN01 800.00 BAG
10005115 BN01 82.74 TO
10005115 BN01 82.74 TO
10005117 BN01 137.30 TO
10005117 BN01 137.30 TO
Regards
Piroz

Hi,
Use these statements to remove the duplicate entries (DELETE ADJACENT DUPLICATES only removes duplicates that sit next to each other, so the table must be sorted first):
SORT lit_bom.
DELETE ADJACENT DUPLICATES FROM lit_bom COMPARING ALL FIELDS.
regards,
Lakshminarayana -
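The SORT / DELETE ADJACENT DUPLICATES fix translates directly into other languages: sorting brings identical rows together, then adjacent duplicates are dropped. A Python sketch using rows taken from the duplicated output shown above:

```python
from itertools import groupby

# Rows copied from the duplicated output (MARA, BWKEY, STPRS, MEINS).
rows = [
    ("10004989", "BN01", "28.00", "TO"),
    ("10004989", "BN01", "28.00", "TO"),
    ("10005010", "BN01", "19.00", "EA"),
    ("10005010", "BN01", "19.00", "EA"),
    ("10005018", "BN01", "800.00", "BAG"),
    ("10005018", "BN01", "800.00", "BAG"),
]
# SORT, then drop adjacent duplicates comparing all fields.
deduped = [key for key, _ in groupby(sorted(rows))]
for row in deduped:
    print(row)
```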
46C Migration Oracle/HP-UX to MAXDB/SLE10 error: Duplicate value QCMT066
To migrate a productive system from 46C to ECC600 I got an system copy export from our business partner.
The source system is running on HPUX/Oracle. The target system should be MAXDB/SLE10.
I ran the import with R3setup and got the error: Duplicate value QCMT066.
To try to continue the import I set in DBMIG.R3S:
[DBR3LOADEXEC_IND_ADA]
from STATUS=ERROR to STATUS=OK
[DBR3LOADVIEW_IND_ADA]
from STATUS=ERROR to STATUS=OK
[DBR3LICENSECREATE_IND_ADA]
STATUS=ERROR to STATUS=OK
[DBR3START_IND_ADA]
from STATUS=ERROR to STATUS=OK
The system is coming up, but in SAPGUI I get GEN_TABLE_NOT_EXISTS .. Table VSEOPARENT does not exist.
I believe that I have to run the import again. How can I solve the duplicate value QCMT066 problem?
Edited by: Trieu Chi Phung on Aug 4, 2009 4:01 PM

Answer from SAP:
Did you skip the error? There are errors in both SAPVIEW.log and SAPAPPL1.log.
The SAPVIEW step fails because the table VBKD was not imported yet; this table belongs to package SAPAPPL1, so you have to finish importing SAPAPPL1 first.
You have the error "Duplicate key in index" for table QCMT066.
One of the most important things to do before a migration starts is to check the consistency between the database and the ABAP DDIC (DB02 > Checks > Database <-> ABAP/4 DDIC) and to look for QCM tables left over from previously failed table conversions. These temporary objects are used during conversion (see note 9385; the note is for 3.0F but explains the situation).
I would like you to proceed as follows:
1. In the source system, check SE14 > Extras > Invalid Temp. Tables and remove all entries from there.
2. Switch to user <sid>adm, create a new temp directory and run 'R3ldctl' without any parameters.
3. Check the STR files created and grep for entries starting with QCM.
4. For those objects, use function module 'DD_NAMETAB_DELETE' to remove them from the nametab.
5. Repeat the export from scratch.
If you want a workaround for this, you can modify the <package>.STR file, remove the QCMT066 entry, and restart the import to continue.
However, this may be tedious if you have lots of objects of this kind. -
A client has the following problem: When trying to export files in the original Raw format (.CR2), the operation failed and a dialog informs us that "Some export operations were not performed". We can export JPEGs just fine.
After lots of testing, this is what we discovered: The problem does not appear to be catalog-based; we run into the same problem using a test catalog. The export operation works fine, however, with the same test catalog and images on two other machines, and it even works fine on the same machine with a new user admin account that we created.
So it appears to be something associated with the user account, or with how LR is interacting with the user account, but I cannot figure out what it might be. We can copy the Raw files when using the Finder to do so, but not via Lightroom. Any thoughts, ideas or suggestions would be greatly appreciated! Thanks, Sean

As part of your testing, make sure you've tried exporting JPEGs from the same folder as the CR2s you're exporting as originals, and exporting JPEGs from CR2s to the same folder you're exporting your originals to. This would test LR's read and write permissions on the folders being used.
From this document you can see that all the LR files, other than the program itself, are under the /Users folder, with different values possible for each user. So the difference could be that the users that work are using the original LR defaults (if you have only just started using LR with them since this problem began), or at least that the user with the problem has some oddball settings that are interfering with Export as Original.
Preference and other file locations in Lightroom 5
You can test for the preferences being off either by copying the preferences from the user that has the problem to a user that doesn't (probably through an external or common folder), or by deleting the preferences file from the user that has the problem and seeing if that resolves it. Moving or renaming the preferences file may be better than deleting it: if no change is seen, you can restore the preferences by moving or renaming the file back to its original location and name.
Colour duplicate values in html output
Hello,
I am looking to colour duplicate values in an html output file.
I have written a folder comparison ps script which shows output as below:
Name Length Version
Name length and version are the column names
Now there are many duplicate entries under the Name column which I need to highlight in a single colour. I have sorted the output under Name in alphabetical order.
I just need to highlight all duplicate values in a particular colour.
Thanks in advance.

Posting my script here:
# Get start time
$startDTM = (Get-Date)
# Inline CSS for the HTML report
$a = "<style>"
$a = $a + "BODY{background-color:peachpuff;}"
$a = $a + "TABLE{border-width: 1px;border-style: solid;border-color: black;border-collapse: collapse;}"
$a = $a + "TH{border-width: 1px;padding: 0px;border-style: solid;border-color: black;background-color:thistle}"
$a = $a + "TD{border-width: 1px;padding: 0px;border-style: solid;border-color: black;background-color:PaleGoldenrod}"
$a = $a + "</style>"
$folderReference = Read-Host "Enter Reference Folder Name"
$folderDifference = Read-Host "Enter Difference Folder Name"
$FolderReferenceContents = Get-ChildItem $folderReference -Recurse |
    Where-Object { -not $_.PSIsContainer }
$FolderDifferenceContents = Get-ChildItem $folderDifference -Recurse |
    Where-Object { -not $_.PSIsContainer }
# Files that are in the reference folder but not in the difference folder.
# Note: 'Get-FileVersionInfo' is not a property of FileInfo, so compare on
# Name and Length and pick up VersionInfo from the objects afterwards.
$one = Compare-Object -ReferenceObject $FolderReferenceContents `
    -DifferenceObject $FolderDifferenceContents -Property Name, Length -PassThru |
    Where-Object { $_.SideIndicator -eq '<=' } |
    Select-Object Name, Length, LastWriteTime, VersionInfo
# Files that are in the difference folder but not in the reference folder
$two = Compare-Object -ReferenceObject $FolderReferenceContents `
    -DifferenceObject $FolderDifferenceContents -Property Name, Length -PassThru |
    Where-Object { $_.SideIndicator -eq '=>' } |
    Select-Object Name, Length, LastWriteTime, VersionInfo
# Wrap in @() so concatenation works even when a side has a single result
$three = @($one) + @($two) | Sort-Object -Property Name | ConvertTo-Html -Fragment
# Get end time and report elapsed time
$endDTM = (Get-Date)
$four = "Elapsed Time: $(($endDTM - $startDTM).TotalSeconds) seconds"
# $three is already the combined HTML fragment, so only it and the
# timing line go into the body
ConvertTo-Html -Head $a -Body "$three $four" -Title "Output" | Out-File C:\output.html
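What the script above does not yet do is the actual duplicate highlighting the question asks about. One straightforward approach is two passes: first count occurrences of each Name, then emit an inline background colour on rows whose name appears more than once. A minimal sketch of that logic (shown in JavaScript purely for illustration — the record shape, column set, and colour are assumptions; the same two-pass idea ports directly to the PowerShell script, e.g. using Group-Object to build the counts):

```javascript
// Build an HTML table fragment, highlighting rows whose Name occurs
// more than once. The input shape ({ name, length }) is assumed.
function toHighlightedTable(rows, highlightColor) {
  // First pass: count how many times each name appears
  const counts = new Map();
  for (const r of rows) {
    counts.set(r.name, (counts.get(r.name) || 0) + 1);
  }
  // Second pass: emit one <tr> per row, styling duplicates
  const body = rows
    .map((r) => {
      const style =
        counts.get(r.name) > 1
          ? ` style="background-color:${highlightColor}"`
          : "";
      return `<tr${style}><td>${r.name}</td><td>${r.length}</td></tr>`;
    })
    .join("\n");
  return `<table>\n<tr><th>Name</th><th>Length</th></tr>\n${body}\n</table>`;
}
```

Because the rows are already sorted by name, duplicates end up adjacent, so the highlighted runs are easy to scan visually.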
Hi
I want to let you know, that sometimes APEX 3.1.2.00.0 creates invalid export file.
Older apex 3.0 created correct file.
For example: our application has page button, where "Optional URL Redirect" is:
Page: &APP_PAGE_ID.
Request: FLOW_XMLP_OUTPUT_R11531800061044170_et
If we export application into file f110.sql and try to import it to the same workspace on the same server, we get error:
ORA-20001: GET_BLOCK Error. ORA-20001: Execution of the statement was unsuccessful. ORA-06550: line 28, column 111: PLS-00103: Encountered the symbol "_" when expecting one of the following: ) , * & | = - + < / > at in is mod remainder not rem => .. <an exponent (**)> <> or != or ~= >= <= <> and or like LIKE2_ LIKE4_ LIKEC_ as between from using || multiset member SUBMULTISET_ The symbol &a
In the exported f110.sql file the invalid section is:
wwv_flow_api.create_page_branch(
p_id=>11762805016890347 + wwv_flow_api.g_id_offset,
p_flow_id=> wwv_flow.g_flow_id,
p_flow_step_id=> 4,
p_branch_action=> 'f?p=&APP_ID.:&APP_PAGE_ID.:&SESSION.:FLOW_XMLP_OUTPUT_R'||to_char(10255206661122183_et+wwv_flow_api.g_id_offset)||':&DEBUG.:::',
p_branch_point=> 'AFTER_PROCESSING',
p_branch_type=> 'REDIRECT_URL',
p_branch_when_button_id=>11761415275883875+ wwv_flow_api.g_id_offset,
p_branch_sequence=> 10,
p_branch_comment=> 'Created 20-JUUNI-2008 12:05 by XXXX');
If we exported the same application in APEX 3.0, this section was correct:
wwv_flow_api.create_page_branch(
p_id=>11762805016890347 + wwv_flow_api.g_id_offset,
p_flow_id=> wwv_flow.g_flow_id,
p_flow_step_id=> 4,
p_branch_action=> 'f?p=&APP_ID.:&APP_PAGE_ID.:&SESSION.:FLOW_XMLP_OUTPUT_R10255206661122183_et:&DEBUG.:::',
p_branch_point=> 'AFTER_PROCESSING',
p_branch_type=> 'REDIRECT_URL',
p_branch_when_button_id=>11761415275883875+ wwv_flow_api.g_id_offset,
p_branch_sequence=> 10,
p_branch_comment=> 'Created 20-JUUNI-2008 12:05 by XXXX');
Best Regards,
Tõnu

Thanks for pointing that out. We'll fix it. It does appear, though, that in 3.0 the offset was not added to the numeric part of the request value, so if you installed the application under a different application ID or into a different workspace, the request would have failed.
Scott -
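To see why the generated code is invalid, note that the request value `FLOW_XMLP_OUTPUT_R<id>_et` embeds a numeric component ID, and on import the workspace ID offset has to be applied to the numeric part only, with the `_et` suffix re-appended outside the arithmetic; the buggy export instead left `_et` glued to the number inside `to_char(...)`. A small sketch of the correct rewrite (JavaScript for illustration only — the function name and prefix/suffix split are hypothetical, not APEX's actual implementation):

```javascript
// Apply an import-time ID offset to the numeric part of a request value
// like "FLOW_XMLP_OUTPUT_R11531800061044170_et". BigInt is used because
// APEX component IDs exceed Number's safe integer range.
function offsetRequest(request, offset) {
  const m = request.match(/^(FLOW_XMLP_OUTPUT_R)(\d+)(_et)?$/);
  if (!m) return request; // not an ID-bearing request: leave unchanged
  const [, prefix, id, suffix = ""] = m;
  return prefix + (BigInt(id) + BigInt(offset)).toString() + suffix;
}
```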
The version number of export file in Java Card 2.2 is not correct
Hi all,
I used JDK 1.3.1 and Java Card 2.2's exp2text tool to look at the contents of an export file. The value of minor_version is 1, but the JCVM 2.2 spec says it should be 2. So which one is correct?
Joey

Read my post again... the minor version is there to indicate binary compatibility or incompatibility between successive implementations of a package.
I can generate CAP files with JC 2.2 and load them onto my JC 2.1.1 cards, as long as I'm not using the JC 2.2 implementation.
Run exp2text on javacard.framework.service and notice that it reads minor = 2. That's because to use that export file, JC 2.2 is the minimal required implementation. -
3.1 EA3 - Column value filter dropdown returns duplicate values
Hello,
SQL Developer 3.1 EA3 introduces a regression: the column value filter in the table data editor dropdown (filter column) now shows duplicate values. Previous versions (at least 3.0) returned unique values to select from.
Since this is EA3 and not yet production, should I file a bug for this?
Best Regards and many thanks for this tool,
Olivier.
Edited by: user9378013 on Jan 9, 2012 1:08 PM

Yes, it's a regression. I fixed it recently; in the next released build it should work properly.
-Raghu -
Auto-populate Field value From a File Picker Window
I have a PDF form with many fields where my users are required to enter the filename of documents. I am using Adobe Acrobat 9 Pro
I've been trying to find a way to create a field Action that does the following:
User clicks field
File browser window opens
User selects relevant file
Filename and extension is placed into the field that was clicked
Any tips or suggestions on how to implement this would be welcome.
Additionally, if it is possible, could it support multiple field/documents in a single interactions. Example: User clicks field1 and selects 5 files. The files filenames are placed in fields1, field2, field3, field4, and field5 respectively.
P.S.
I am not trying to attach files to the PDF. I just want the filenames to be placed into the form.

Are the files being selected PDF files, or other types? There's another method that can be used for file selection, but it only lists PDF files. It is possible (at least on Windows computers) to use it to select any file type, but it's a bit tricky...
If not, using a hidden field is your only option. But you can reset it after copying the file name to the other field, like this:
this.getField("FilePath").browseForFileToSubmit(); // opens the file picker
var filePath = this.getField("FilePath").value;
// keep only the part after the last backslash, i.e. the file name
this.getField("FileName").value = filePath.substring(filePath.lastIndexOf("\\")+1);
this.getField("FilePath").value = ""; // clear the hidden field again
If you use this code (as the button's Mouse Up action), make sure you remove all other code associated with the text fields. And you don't need to create a duplicate hidden field for each file selection field; only one will do... It can even be placed in a doc-level function so you only have to call it from each button and keep the code itself in a single location.
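The file-name extraction in that snippet is plain string handling and can be checked outside Acrobat. A standalone sketch (the helper name is made up; the backslash assumption matches the Windows-style path the hidden field receives):

```javascript
// Strip the directory part of a Windows-style path, keeping name + extension.
// If there is no backslash, lastIndexOf returns -1 and the whole string is kept.
function baseName(filePath) {
  return filePath.substring(filePath.lastIndexOf("\\") + 1);
}
```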
Maybe you are looking for
-
VGA to mini Displayport adaptors?
I suppose this question has been asked a million times before, but does anyone know of a VGA to mini DP adaptor? I'd like to connect my Dell Latitude E4310 (work lappy) to my 27" cinema display. Haven't had much luck finding the right gear, though. O
-
Aging and slowing computer, time to replace RAM?
I have a late 2008 Black MacBook with 4gb 667MHz DDR2 SDRAM and about 45gb (of 250gb) of free space on my HD. Lately, my poor MacBook has been getting extremely slow even with the most mundane of tasks like checking my email. I notice that my memory
-
Trying to place a new order (Transfer from Virgin/...
Hi all, I am trying to transfer across from TalkTalk/Virgin, and have hit a snag - I hit this error message when submitting for the package I want: Sorry- "We can see from your details that you've already placed an order to stop your current service.
-
How to change the textarea to an html editor in the knowledge maintenance page?
Hi, experts, When we maintain the knowledge, we type information into the textarea. Now we want to use html tags to show the knowledge info, so that the text can be shown in an html table. How do we change the textarea to an html editor? Thanks. Oliver.
-
TS1702 app store on ipod will not update apps, says it cannot connect to itunes store
This is on an iPod 4. I have hard-booted it, checked wi-fi connections, downloaded and unloaded new apps, checked my Apple ID and password, and synced with the desktop and did updates there, but that didn't allow them to update on my iPod.