Eliminate spaces in CSV file

Hi
I have a script (see below) which runs and spools the following information:
select
STU_ID||','||CRSLIST||','||FIRST_NAME||','||SURNAME||','||MID_INITIAL||','||substr(rtrim(STU_ID),-1)||','||STUREF studata
from LOC_STUEMAIL
where STU_ACT = 'A'
order by STU_ID
I spool it to a CSV file. STUREF is a number field.
The script does what it is supposed to do. The issue is that there are lots of spaces in the spool file after the STUREF field, which I want to eliminate. I have tried RTRIM but it is not eliminating them.
Any help please.

Hi HM,
I tried trimspool, but it did not help me.
@Ramesh: TRIM did not help me either.
This is the script
set linesize 192
column studata format A190
spool &OPPATH.ADD&RUN_NO..csv
select
'cn'||','||'Description'||','||'givenName'||','||'sn'||','||'initials'||','||'LastDigit'||','||'Stu_ID' studata
from dual;
select
STU_ID||','||CRSLIST||','||FIRST_NAME||','||SURNAME||','||MID_INITIAL||','||substr(rtrim(STU_ID),-1)||','||STUREF studata
from LOC_STUEMAIL
where STU_ACT = 'A'
order by STU_ID;
spool off
All the fields are fine except the last one.
Edited by: user12951692 on 20-Apr-2012 03:53
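For anyone hitting the same thing: the trailing blanks normally come from SQL*Plus padding the studata column to its A190 format width, not from the data itself, so trimming the data with RTRIM has no effect. Below is a minimal sketch of the settings that usually fix it; note that TRIMSPOOL only takes effect if it is ON at the time the lines are written to the spool file, which may be why it appeared not to help, and the SET values that are not already in the script above are assumptions about the session:
set linesize 192
set trimspool on   -- trim trailing blanks from each line written to the spool file
set pagesize 0     -- suppress page headings and page-break blank lines
set feedback off   -- suppress the "n rows selected" message
column studata format A190
spool &OPPATH.ADD&RUN_NO..csv
select STU_ID||','||CRSLIST||','||FIRST_NAME||','||SURNAME||','||MID_INITIAL||','||substr(rtrim(STU_ID),-1)||','||STUREF studata
from LOC_STUEMAIL
where STU_ACT = 'A'
order by STU_ID;
spool off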

Similar Messages

  • Eliminate spaces in csv file

    Hi,
    db version:10.2.0.4.0
    o/s: windows 2003
    I have the following script which works fine except for one thing. I spool the output to a CSV file and after the last field (STUREF) there are many spaces. STUREF is a number field. I tried using RTRIM but it does not help.
    set linesize 192
    column studata format A190
    spool &OPPATH.ADD&RUN_NO..csv
    select
    'cn'||','||'Description'||','||'givenName'||','||'sn'||','||'initials'||','||'LastDigit'||','||'Stu_ID' studata
    from dual;
    select
    STU_ID||','||CRSLIST||','||FIRST_NAME||','||SURNAME||','||MID_INITIAL||','||substr(rtrim(STU_ID),-1)||','||STUREF studata
    from LOC_STUEMAIL
    where STU_ACT = 'A'
    order by STU_ID;
    spool off
    Thank you for the help.

    Thank You for your response
    I just tried with LTRIM(RTRIM(STU_ID)),
    but the spaces still exist.
    STUDATA
    ZoS5071,ZH22151,Zo,Soraf,S,1,Zo5071
    BoeS6451,ZH23253,Zo,Skinner,?,1,Zo6451
    ToeW1100,ZP32611,oe,White,?,0,ZoeW110
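    One quick check, in case it helps: if the blanks are being added by the SQL*Plus column formatting rather than stored in the data, LENGTH of the expression will be much shorter than the spooled line. A hedged sketch of that check, reusing the columns from the script above:
    select length(STU_ID||','||CRSLIST||','||FIRST_NAME||','||SURNAME||','||MID_INITIAL||','||substr(rtrim(STU_ID),-1)||','||STUREF) len
    from LOC_STUEMAIL
    where STU_ACT = 'A';
    If the lengths come back as the visible width of each row (around 35 here) while the spooled lines are padded out to 190 characters, the fix is SET TRIMSPOOL ON rather than more TRIM/RTRIM calls on the columns.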

  • How to avoid spaces in csv file at the time of spooling

    Hi all,
    I am spooling 30 query results into one CSV file and I am getting two empty rows for each query.
    Can anyone suggest how to avoid these spaces?
    Thanks & Regards,
    P Prakash
    This is the script I am using to generate the CSV file:
    SET linesize 12000
    SET pagesize 10000
    SET pause off
    SET termout off
    SET feed off
    SET head off
    SPOOL c:\tes111.csv replace
    SELECT 'ISA01,ISA02,ISA03,ISA04,ISA05,ISA06,ISA07,ISA08,ISA09,ISA11,ISA12,ISA13,ISA14,ISA15,ISA16,GS01,GS02,GS03,GS04,GS05,GS06,GS07,GS08,ST01,ST02,ST03,BHT01,BHT02,BHT03,BHT04,BHT06,trnsctn_segment_count,included_trnsctn_sets_count,included_fnctnl_groups_count,input_acknwldgmnt_sid'
    FROM DUAL;
    SELECT ptr.athrztn_infrmtn_qlfr
    || ','
    || ptr.athrztn_infrmtn
    || ','
    || ptr.scrty_infrmtn_qlfr
    || ','
    || ptr.scrty_infrmtn
    || ','
    || ptr.intrchng_sndr_idntfr_qlfr
    || ','
    || ptr.intrchng_sndr_idntfr
    || ','
    || ptr.intrchng_rcvr_idntfr_qlfr
    || ','
    || ptr.intrchng_rcvr_idntfr
    || ','
    || ptr.intrchng_date
    || ','
    || ptr.intrchng_cntrl_stndrds_idntfr
    || ','
    || ptr.intrchng_cntrl_vrsn_nmbr
    || ','
    || ptr.intrchng_cntrl_nmbr
    || ','
    || ptr.acknwldgmnt_rqstd_indctr
    || ','
    || ptr.usg_indctr
    || ','
    || ptr.cmpnt_elmnt_sprtr
    || ','
    || ptr.fnctnl_idntfr_code
    || ','
    || ptr.aplctn_sndr_code
    || ','
    || ptr.applctn_rcvr_code
    || ','
    || ptr.fnctnl_grp_crtn_date
    || ','
    || ptr.fnctnl_grp_crtn_date
    || ','
    || ptr.grp_cntrl_nmbr
    || ','
    || ptr.rspnsbl_agncy_code
    || ','
    || ptr.vrsn_rls_indstry_idntfr_code
    || ','
    || ptr.trnsctn_set_idntfr_code
    || ','
    || ptr.trnsctn_set_cntrl_nmbr
    || ','
    || ptr.implementation_guide_vrsn_name
    || ','
    || ptr.hierarchical_structure_code
    || ','
    || ptr.trnsctn_set_purpose_lkpcd
    || ','
    || ptr.sbmtr_trnsctn_idntfr
    || ','
    || ptr.trnsctn_set_creation_date
    || ','
    || ptr.trnsctn_type_code
    || ','
    || ptr.trnsctn_segment_count
    || ','
    || ptr.included_trnsctn_sets_count
    || ','
    || ptr.included_fnctnl_groups_count
    || ','
    || ia.input_acknwldgmnt_sid
    || ','
    FROM pa_transaction_request ptr, input_acknwldgmnt ia, input_batch_file ibf
    WHERE ptr.input_acknwldgmnt_sid = ia.input_acknwldgmnt_sid
    AND ia.input_batch_file_sid = ibf.input_batch_file_sid
    AND ibf.original_file_name =
    'HIPAA.165760000.20110518I001.278_GC04-3.dat';
    SELECT 'pa_rqst_sid,NM01,NM02,NM108,NM109'
    FROM DUAL;
    SELECT pr.pa_rqst_sid
    || ','
    || pr.entity_idntfr_lkpcd
    || ','
    || pr.entity_type_qlfr
    || ','
    || pr.payer_idntfctn_code_qlfr
    || ','
    || pr.payer_idntfctn_code
    || ','
    FROM pa_request pr,
    pa_transaction_request ptr,
    input_acknwldgmnt ia,
    input_batch_file ibf
    WHERE pr.pa_trnsctn_rqst_sid = ptr.pa_trnsctn_rqst_sid
    AND ptr.input_acknwldgmnt_sid = ia.input_acknwldgmnt_sid
    AND ia.input_batch_file_sid = ibf.input_batch_file_sid
    AND ibf.original_file_name =
    'HIPAA.165760000.20110518I001.278_GC04-3.dat';
    SELECT 'meta_data_cid,data_value,iteration_number,Created_by,modified_by'
    FROM DUAL;
    SELECT prs.meta_data_cid
    || ','
    || prs.data_value
    || ','
    || prs.iteration_number
    || ','
    || prs.created_by
    || ','
    || prs.modified_by
    || ','
    FROM pa_request pr,
    pa_transaction_request ptr,
    input_acknwldgmnt ia,
    input_batch_file ibf,
    pa_request_situational prs
    WHERE pr.pa_rqst_sid = prs.pa_rqst_sid
    AND prs.stnl_target_table_cid = 100
    AND prs.target_table_sid = pr.pa_rqst_sid
    AND pr.pa_trnsctn_rqst_sid = ptr.pa_trnsctn_rqst_sid
    AND ptr.input_acknwldgmnt_sid = ia.input_acknwldgmnt_sid
    AND ia.input_batch_file_sid = ibf.input_batch_file_sid
    AND ibf.original_file_name =
    'HIPAA.165760000.20110518I001.278_GC04-3.dat';
    SELECT 'NM101 ,NM103,NM104,NM105,NM108,NM109,PER02,PER04,PER05,PER06,PER07 ,PER08'
    FROM DUAL;
    SELECT pr.pa_rqst_sid
    || ','
    || prpl.pa_rqst_x_prvdr_lctn_sid
    || ','
    || prpl.rqstr_entity_idntfr_lkpcd
    || ','
    || prpl.last_name
    || ','
    || prpl.first_name
    || ','
    || prpl.middle_name
    || ','
    || prpl.idntfr_type_cid
    || ','
    || prpl.prvdr_lctn_iid
    || ','
    || prpl.rqstr_contact_name
    || ','
    || prpl.cmnctn_nmbr_1
    || ','
    || prpl.cmnctn_nmbr_2
    || ','
    || prpl.cmnctn_nmbr_type_lkpcd_1
    || ','
    || prpl.cmnctn_nmbr_type_lkpcd_2
    || ','
    || prpl.cmnctn_nmbr_type_lkpcd_3
    FROM pa_request_x_provider_location prpl,
    pa_request pr,
    pa_transaction_request ptr,
    input_acknwldgmnt ia,
    input_batch_file ibf
    WHERE prpl.pa_rqst_sid = pr.pa_rqst_sid
    AND pr.pa_trnsctn_rqst_sid = ptr.pa_trnsctn_rqst_sid
    AND ptr.input_acknwldgmnt_sid = ia.input_acknwldgmnt_sid
    AND ia.input_batch_file_sid = ibf.input_batch_file_sid
    AND ibf.original_file_name =
    'HIPAA.165760000.20110518I001.278_GC04-3.dat'
    AND prpl.pa_prvdr_type_lkpcd = 'RR';
    SELECT 'meta_data_cid,data_value,iteration_number,Created_by,modified_by'
    FROM DUAL;
    SELECT prs.meta_data_cid
    || ','
    || prs.data_value
    || ','
    || prs.iteration_number
    || ','
    || prs.created_by
    || ','
    || prs.modified_by
    || ','
    FROM pa_request_x_provider_location prpl,
    pa_request pr,
    pa_transaction_request ptr,
    input_acknwldgmnt ia,
    input_batch_file ibf,
    pa_request_situational prs
    WHERE prpl.pa_rqst_sid = prs.pa_rqst_sid
    AND prs.stnl_target_table_cid = 102
    AND prs.target_table_sid = prpl.pa_rqst_x_prvdr_lctn_sid
    AND prpl.pa_rqst_sid = pr.pa_rqst_sid
    AND pr.pa_trnsctn_rqst_sid = ptr.pa_trnsctn_rqst_sid
    AND ptr.input_acknwldgmnt_sid = ia.input_acknwldgmnt_sid
    AND ia.input_batch_file_sid = ibf.input_batch_file_sid
    AND ibf.original_file_name =
    'HIPAA.165760000.20110518I001.278_GC04-3.dat'
    AND prpl.pa_prvdr_type_lkpcd = 'RR';
    -- 4. Subscriber
    -- NM103 Name Last or Organization Name
    -- NM104 Name First
    -- NM108 Identification Code Qualifier
    -- NM109 Identification Code
    -- DMG02 Date Time Period birth date
    -- DMG03 Gender Code
    SELECT ' NM103,NM104,NM108,NM109,DMG02,DMG03'
    FROM DUAL;
    SELECT prxm.last_name
    || ','
    || prxm.first_name
    || ','
    || prxm.idntfr_type_cid
    || ','
    || prxm.mbr_idntfr
    || ','
    || prxm.birth_date
    || ','
    || prxm.gender_lkpcd
    || ','
    FROM pa_request_x_member prxm,
    pa_request pr,
    pa_transaction_request ptr,
    input_acknwldgmnt ia,
    input_batch_file ibf
    WHERE prxm.pa_rqst_sid = pr.pa_rqst_sid
    AND pr.pa_trnsctn_rqst_sid = ptr.pa_trnsctn_rqst_sid
    AND ptr.input_acknwldgmnt_sid = ia.input_acknwldgmnt_sid
    AND ia.input_batch_file_sid = ibf.input_batch_file_sid
    AND ibf.original_file_name =
    'HIPAA.165760000.20110518I001.278_GC04-3.dat';
    -- 5. Subscriber Situational Info
    -- NM102 Entity Type Qualifier
    -- NM105 Name Middle
    -- NM107 Name Suffix
    -- REF01 Reference Identification Qualifier
    -- REF02 Reference Identification
    -- N301 Address Information
    -- N302 Address Information
    -- N401 City Name
    -- N402 State or Province Code
    -- N403 Postal Code
    -- N407 Country Subdivision Code
    SELECT 'meta_data_cid,data_value,iteration_number,Created_by,modified_by'
    FROM DUAL;
    SELECT prs.meta_data_cid
    || ','
    || prs.data_value
    || ','
    || prs.iteration_number
    || ','
    || prs.created_by
    || ','
    || prs.modified_by
    || ','
    FROM pa_request_x_member prxm,
    pa_request pr,
    pa_transaction_request ptr,
    input_acknwldgmnt ia,
    input_batch_file ibf,
    pa_request_situational prs
    WHERE prxm.pa_rqst_sid = pr.pa_rqst_sid
    AND pr.pa_trnsctn_rqst_sid = ptr.pa_trnsctn_rqst_sid
    AND ptr.input_acknwldgmnt_sid = ia.input_acknwldgmnt_sid
    AND ia.input_batch_file_sid = ibf.input_batch_file_sid
    AND prs.stnl_target_table_cid = 101
    AND prs.target_table_sid = prxm.pa_rqst_x_mbr_sid
    AND ibf.original_file_name =
    'HIPAA.165760000.20110518I001.278_GC04-3.dat';
    -- 6. Patient Event Level
    -- TRN02 Reference Identification
    -- TRN03 Originating Company Identifier
    -- TRN04 Reference Identification
    -- UM01 Request Category Code
    -- UM02 Certification Type Code
    -- UM03 Service Type Code
    -- UM04-1 Facility Code Value
    -- UM04-2 Facility Code Qualifier
    -- UM06 Level of Service Code
    -- UM07 Current Health Condition Code
    -- UM08 Prognosis Code
    -- UM09 Release of Information Code
    -- UM10 Delay Reason Code
    SELECT 'TRN02,TRN03,TRN04,UMO1,UMO2,UM04_1,UM04_2,UM06,UM07,UM08,UM09,UM10,'
    FROM DUAL;
    SELECT prd.patient_event_tracking_number
    || ','
    || prd.orginating_company_identifier
    || ','
    || prd.trace_assigning_entity_idntfr
    || ','
    || prs.rqst_ctgry_lkpcd
    || ','
    || prs.rqst_crtfctn_type_lkpcd
    || ','
    || prs.x12_pa_srvc_type_code
    || ','
    || prs.facility_type_code
    || ','
    || prs.unfrm_blng_facility_type_code
    || ','
    || prs.srvc_rqd_lkpcd
    || ','
    || prs.current_health_cndtn_lkpcd
    || ','
    || prs.prognosis_lkpcd
    || ','
    || prs.rls_of_info_lkpcd
    || ','
    || prs.delay_reason_lkpcd
    || ','
    FROM pa_request_service prs,
    pa_request pr,
    pa_transaction_request ptr,
    input_acknwldgmnt ia,
    input_batch_file ibf,
    pa_request_detail prd
    WHERE prs.pa_rqst_sid = pr.pa_rqst_sid
    AND pr.pa_trnsctn_rqst_sid = ptr.pa_trnsctn_rqst_sid
    AND ptr.input_acknwldgmnt_sid = ia.input_acknwldgmnt_sid
    AND ia.input_batch_file_sid = ibf.input_batch_file_sid
    AND pr.pa_rqst_sid = prd.pa_rqst_sid
    AND ibf.original_file_name =
    'HIPAA.165760000.20110518I001.278_GC04-3.dat';
    --Patient Event Level situational information
    SELECT 'meta_data_cid,data_value,iteration_number,Created_by,modified_by'
    FROM DUAL;
    SELECT prs.meta_data_cid
    || ','
    || prs.data_value
    || ','
    || prs.iteration_number
    || ','
    || prs.created_by
    || ','
    || prs.modified_by
    || ','
    FROM pa_request_service prsv,
    pa_request pr,
    pa_transaction_request ptr,
    input_acknwldgmnt ia,
    input_batch_file ibf,
    pa_request_situational prs
    WHERE prsv.pa_rqst_sid = pr.pa_rqst_sid
    AND pr.pa_trnsctn_rqst_sid = ptr.pa_trnsctn_rqst_sid
    AND ptr.input_acknwldgmnt_sid = ia.input_acknwldgmnt_sid
    AND ia.input_batch_file_sid = ibf.input_batch_file_sid
    AND prs.stnl_target_table_cid = 103
    AND prs.target_table_sid = prsv.pa_rqst_srvc_sid
    AND ibf.original_file_name =
    'HIPAA.165760000.20110518I001.278_GC04-3.dat';
    ---Diagnosis
    -- HI01 Health Care Code Information
    -- HI01-1 Code List Qualifier Code
    -- HI01-2 Industry Code
    -- HI01-3 Date Time Period Format Qualifier
    -- HI01-4 Date Time Period
    SELECT 'HI01_1,HI01_2,HI01_3'
    FROM DUAL;
    SELECT prd.pa_diagnosis_type_lkpcd
    || ','
    || prd.diagnosis_iid
    || ','
    || prd.from_date
    || ','
    FROM pa_request_x_diagnosis prd,
    pa_request pr,
    pa_transaction_request ptr,
    input_acknwldgmnt ia,
    input_batch_file ibf
    WHERE prd.pa_rqst_sid = pr.pa_rqst_sid
    AND pr.pa_trnsctn_rqst_sid = ptr.pa_trnsctn_rqst_sid
    AND ptr.input_acknwldgmnt_sid = ia.input_acknwldgmnt_sid
    AND ia.input_batch_file_sid = ibf.input_batch_file_sid
    AND ibf.original_file_name =
    'HIPAA.165760000.20110518I001.278_GC04-3.dat';
    -- Request detail
    -- TRN02 Reference Identification
    -- TRN03 Originating Company Identifier
    -- TRN04 Reference Identification
    SELECT 'TRN02,TRN03,TRN04'
    FROM DUAL;
    SELECT prd.patient_event_tracking_number
    || ','
    || prd.orginating_company_identifier
    || ','
    || prd.trace_assigning_entity_idntfr
    || ','
    FROM pa_request_detail prd,
    pa_request pr,
    pa_transaction_request ptr,
    input_acknwldgmnt ia,
    input_batch_file ibf
    WHERE prd.pa_rqst_sid = pr.pa_rqst_sid
    AND pr.pa_trnsctn_rqst_sid = ptr.pa_trnsctn_rqst_sid
    AND ptr.input_acknwldgmnt_sid = ia.input_acknwldgmnt_sid
    AND ia.input_batch_file_sid = ibf.input_batch_file_sid
    AND ibf.original_file_name =
    'HIPAA.165760000.20110518I001.278_GC04-3.dat';
    -- Ambulance Transportation
    -- CR101 Unit or Basis for Measurement Code
    -- CR102 Weight
    -- CR103 Ambulance Transport Code
    -- CR104 Ambulance Transport Reason Code
    -- CR105 Unit or Basis for Measurement Code
    -- CR106 Quantity
    -- CR109 Description
    -- CR110 Description
    SELECT 'CR101,CR102,CR103,CR104,CR105,CR106,CR109,CR110'
    FROM DUAL;
    SELECT pat.weight_uom_code
    || ','
    || pat.patient_weight
    || ','
    || pat.amblnc_transport_type_lkpcd
    || ','
    || pat.amblnc_transport_rsn_lkpcd
    || ','
    || pat.distance_uom_code
    || ','
    || pat.trnsprtn_distance
    || ','
    || pat.round_trip_purpose_desc
    || ','
    || pat.stretcher_purpose_desc
    || ','
    FROM pa_request_ambulance_transport pat,
    pa_request pr,
    pa_transaction_request ptr,
    input_acknwldgmnt ia,
    input_batch_file ibf
    WHERE pat.pa_rqst_sid = pr.pa_rqst_sid
    AND pr.pa_trnsctn_rqst_sid = ptr.pa_trnsctn_rqst_sid
    AND ptr.input_acknwldgmnt_sid = ia.input_acknwldgmnt_sid
    AND ia.input_batch_file_sid = ibf.input_batch_file_sid
    AND ibf.original_file_name =
    'HIPAA.165760000.20110518I001.278_GC04-3.dat';
    -- 2000F SV1
    -- SV101-1 Product/Service ID Qualifier Change values
    -- SV101-2 Product/Service ID Need to change label to Procedure Code.
    -- SV101-3 Procedure Modifier No change required to PA screen
    -- SV101-4 Procedure Modifier No change required to PA screen
    -- SV101-5 Procedure Modifier No change required to PA screen
    -- SV101-6 Procedure Modifier No change required to PA screen
    -- SV101-7 Description No change required to PA screen. Will map to the Remarks field on the Screen
    -- SV102 Monetary Amount
    -- SV103 Unit or Basis for Measurement Code
    -- SV104 Quantity
    -- SV107 Composite Diagnosis Code Pointer
    -- SV107-1 Diagnosis Code Pointer
    -- SV107-2 Diagnosis Code Pointer
    -- SV107-3 Diagnosis Code Pointer
    -- SV107-4 Diagnosis Code Pointer
    -- SV111 Yes/No Condition or Response Code
    -- SV120 Level of Care Code
    SELECT 'TRN_Reference_Identif,TRN_Originating_CMP_Identif,TRN_Reference_IdentiReference Identiff,Reference_Identif,Reference_Identification,Reference_Identif,Product/Service_ID_Qualifier,Product/Service_ID,Procedure_Modifier,Procedure_Modifier,Procedure_Modifier,Procedure_Modifier,Description'
    FROM DUAL;
    SELECT prp.srvc_trace_nmbr_1
    || ','
    || prp.trace_asgn_enty_adtnl_idntfr_1
    || ','
    || prp.trace_asgn_enty_adtnl_idntfr_1
    || ','
    || prp.trace_asgn_enty_adtnl_idntfr_2
    || ','
    || prp.prvs_rvw_athrztn_nmbr
    || ','
    || prp.prvs_administrative_rfrnc_nmbr
    || ','
    || prp.x12_code_list_qlfr_lkpcd
    || ','
    || prp.procedure_iid
    || ','
    || prp.mdfr_code
    || ','
    || prp.mdfr2_code
    || ','
    || prp.mdfr3_code
    || ','
    || prp.mdfr4_code
    || ','
    || prp.drug_desc
    || ','
    FROM pa_request_procedure prp,
    pa_request_service prs,
    pa_request pr,
    pa_transaction_request ptr,
    input_acknwldgmnt ia,
    input_batch_file ibf
    WHERE prp.pa_rqst_srvc_sid = prs.pa_rqst_srvc_sid
    AND prs.pa_rqst_sid = pr.pa_rqst_sid
    AND pr.pa_trnsctn_rqst_sid = ptr.pa_trnsctn_rqst_sid
    AND ptr.input_acknwldgmnt_sid = ia.input_acknwldgmnt_sid
    AND ia.input_batch_file_sid = ibf.input_batch_file_sid
    AND ibf.original_file_name =
    'HIPAA.165760000.20110518I001.278_GC04-3.dat';
    -- SV2 SV201 Product/Service ID
    -- SV2 SV202 Composite Medical Procedure Identifier
    -- SV2 SV202-1 Product/Service ID Qualifier
    -- SV2 SV202-2 Product/Service ID
    -- SV2 SV202-3 Procedure Modifier
    -- SV2 SV202-4 Procedure Modifier
    -- SV2 SV202-5 Procedure Modifier
    -- SV2 SV202-6 Procedure Modifier
    -- SV2 SV202-7 Description
    -- SV2 SV202-8 Product/Service ID
    -- SV2 SV203 Monetary Amount
    -- SV2 SV204 Unit or Basis for Measurement Code
    -- SV2 SV205 Quantity
    -- SV2 SV206 Unit Rate
    -- SV2 SV209 Nursing Home Residential Status Code
    SELECT 'Product/Service ID,Product/Service ID Qual,Product/Service ID,Product Modifier,Product Modifier,Product Modifier,Product Modifier,Description,Unit Rate,'
    FROM DUAL;
    SELECT prp.revenue_iid ||','||
    prp.x12_code_list_qlfr_lkpcd ||','||
    prp.procedure_iid ||','||
    prp.mdfr_code ||','||
    prp.mdfr2_code ||','||
    prp.mdfr3_code ||','||
    prp.mdfr4_code ||','|| prp.drug_desc ||','||
    prp.srvc_line_rate ||','
    FROM pa_request_procedure prp,
    pa_request_service prs,
    pa_request pr,
    pa_transaction_request ptr,
    input_acknwldgmnt ia,
    input_batch_file ibf
    WHERE prp.pa_rqst_srvc_sid = prs.pa_rqst_srvc_sid
    AND prs.pa_rqst_sid = pr.pa_rqst_sid
    AND pr.pa_trnsctn_rqst_sid = ptr.pa_trnsctn_rqst_sid
    AND ptr.input_acknwldgmnt_sid = ia.input_acknwldgmnt_sid
    AND ia.input_batch_file_sid = ibf.input_batch_file_sid
    AND ibf.original_file_name =
    'HIPAA.165760000.20110518I001.278_GC04-3.dat';
    -- 2000F SV3
    -- SV301 Composite Medical Procedure Identifier
    -- SV301-1 Product/Service ID Qualifier
    -- SV301-2 Product/Service ID
    -- SV301-3 Procedure Modifier
    -- SV301-4 Procedure Modifier
    -- SV301-5 Procedure Modifier
    -- SV301-6 Procedure Modifier
    -- SV301-7 Description
    -- SV301-8 Product/Service ID
    -- SV302 Monetary Amount
    -- SV304 Oral Cavity Designation
    -- SV304-1 Oral Cavity Designation Code
    -- SV304-2 Oral Cavity Designation Code
    -- SV304-3 Oral Cavity Designation Code
    -- SV304-4 Oral Cavity Designation Code
    -- SV304-5 Oral Cavity Designation Code
    -- SV305 Prosthesis, Crown or Inlay Code
    -- SV306 Quantity
    -- SV307 Description
    select 'Product/Service ID Qual,Product/Service ID,Procedure Modifier,Procedure Modifier,Procedure Modifier,Procedure Modifier,Description,Oral Cavity Designation Code,Oral Cavity Designation Code,Oral Cavity Designation Code,Oral Cavity Designation Code,ProsthesisCrown Inlay Code,Description' from dual;
    SELECT prp.x12_code_list_qlfr_lkpcd
    || ','
    || prp.procedure_iid
    || ','
    || prp.mdfr_code
    || ','
    || prp.mdfr2_code
    || ','
    || prp.mdfr3_code
    || ','
    || prp.mdfr4_code
    || ','
    || prp.drug_desc
    || ','
    || prp.oral_cavity_dsgntn2_cid
    || ','
    || prp.oral_cavity_dsgntn3_cid
    || ','
    || prp.oral_cavity_dsgntn4_cid
    || ','
    || prp.oral_cavity_dsgntn5_cid
    || ','
    || prp.prosthesis_crown_inlay_code
    || ','
    || prp.remark
    || ','
    FROM pa_request_procedure prp,
    pa_request_service prs,
    pa_request pr,
    pa_transaction_request ptr,
    input_acknwldgmnt ia,
    input_batch_file ibf
    WHERE prp.pa_rqst_srvc_sid = prs.pa_rqst_srvc_sid
    AND prs.pa_rqst_sid = pr.pa_rqst_sid
    AND pr.pa_trnsctn_rqst_sid = ptr.pa_trnsctn_rqst_sid
    AND ptr.input_acknwldgmnt_sid = ia.input_acknwldgmnt_sid
    AND ia.input_batch_file_sid = ibf.input_batch_file_sid
    AND ibf.original_file_name =
    'HIPAA.165760000.20110518I001.278_GC04-3.dat';
    -- 2000F TOO pending Prakash to Write
    --Tooth Information
    SELECT 'Industry Code,Tooth Surface,TOO03-1_Tooth Surface Code,TOO03-2_Tooth Surface Code,TOO03-3_Tooth Surface Code,TOO03-4_Tooth Surface Code,'
    FROM DUAL;
    SELECT prp.tooth_number_cid
    || ','
    || prp.tooth_surface_cid
    || ','
    || prp.tooth_surface2_cid
    || ','
    || prp.tooth_surface3_cid
    || ','
    || prp.tooth_surface4_cid
    || ','
    || prp.tooth_surface5_cid
    || ','
    FROM pa_request_procedure prp,
    pa_request_service prs,
    pa_request pr,
    pa_transaction_request ptr,
    input_acknwldgmnt ia,
    input_batch_file ibf
    WHERE prp.pa_rqst_srvc_sid = prs.pa_rqst_srvc_sid
    AND prs.pa_rqst_sid = pr.pa_rqst_sid
    AND pr.pa_trnsctn_rqst_sid = ptr.pa_trnsctn_rqst_sid
    AND ptr.input_acknwldgmnt_sid = ia.input_acknwldgmnt_sid
    AND ia.input_batch_file_sid = ibf.input_batch_file_sid
    AND ibf.original_file_name =
    'HIPAA.165760000.20110518I001.278_GC04-3.dat';
    -- Tooth Situational Information
    select 'Monetary Amount,Quantity' from dual;
    SELECT prpt.rqst_prcdr_amt||','|| prpt.rqst_prcdr_units||','
    FROM pa_rqst_prcdr_transaction prpt,
    pa_request_procedure prp,
    pa_request_service prs,
    pa_request pr,
    pa_transaction_request ptr,
    input_acknwldgmnt ia,
    input_batch_file ibf
    WHERE prpt.pa_rqst_prcdr_sid = prp.pa_rqst_prcdr_sid
    AND prp.pa_rqst_srvc_sid = prs.pa_rqst_srvc_sid
    AND prs.pa_rqst_sid = pr.pa_rqst_sid
    AND pr.pa_trnsctn_rqst_sid = ptr.pa_trnsctn_rqst_sid
    AND ptr.input_acknwldgmnt_sid = ia.input_acknwldgmnt_sid
    AND ia.input_batch_file_sid = ibf.input_batch_file_sid
    AND ibf.original_file_name =
    'HIPAA.165760000.20110518I001.278_GC04-3.dat';
    -- pa_rqst_prcdr_x_prvdr_lctn
    select 'pa_rqst_prcdr_sid,pa_rqst_x_prvdr_lctn_sid' from dual;
    SELECT prppl.pa_rqst_prcdr_sid||','|| prppl.pa_rqst_x_prvdr_lctn_sid||','
    FROM pa_rqst_prcdr_x_prvdr_lctn prppl,
    pa_request_procedure prp,
    pa_request_service prs,
    pa_request pr,
    pa_transaction_request ptr,
    input_acknwldgmnt ia,
    input_batch_file ibf
    WHERE prppl.pa_rqst_prcdr_sid = prp.pa_rqst_prcdr_sid
    AND prp.pa_rqst_srvc_sid = prs.pa_rqst_srvc_sid
    AND prs.pa_rqst_sid = pr.pa_rqst_sid
    AND pr.pa_trnsctn_rqst_sid = ptr.pa_trnsctn_rqst_sid
    AND ptr.input_acknwldgmnt_sid = ia.input_acknwldgmnt_sid
    AND ia.input_batch_file_sid = ibf.input_batch_file_sid
    AND ibf.original_file_name =
    'HIPAA.165760000.20110518I001.278_GC04-3.dat';
    -- error Details
    --Header level error details
    SELECT 'pa_rqst_sid,pa_error_nmbr,pa_error_sid,reject_reason_lkpcd,follow_up_action_lkpcd,aaa_segment_loop_nmbr,run_nmbr,'
    FROM DUAL;
    SELECT prre.pa_rqst_sid
    || ','
    || pe.pa_error_nmbr
    || ','
    || pe.pa_error_sid
    || ','
    || ped.reject_reason_lkpcd
    || ','
    || ped.follow_up_action_lkpcd
    || ','
    || ped.aaa_segment_loop_nmbr
    || ','
    || prre.run_nmbr
    || ','
    FROM pa_request pr,
    pa_transaction_request ptr,
    input_acknwldgmnt ia,
    input_batch_file ibf,
    pa_request_run_error prre,
    pa_error pe,
    pa_error_detail ped
    WHERE pr.pa_trnsctn_rqst_sid = ptr.pa_trnsctn_rqst_sid
    AND ptr.input_acknwldgmnt_sid = ia.input_acknwldgmnt_sid
    AND ia.input_batch_file_sid = ibf.input_batch_file_sid
    AND pr.pa_rqst_sid = prre.pa_rqst_sid
    AND prre.pa_error_sid = pe.pa_error_sid
    AND pe.pa_error_sid = ped.pa_error_sid
    AND ibf.original_file_name =
    'HIPAA.165760000.20110518I001.278_GC04-3.dat';
    ---Line Level Error Details
    SELECT 'pa_rqst_sid,pa_error_nmbr,pa_error_sid,reject_reason_lkpcd,follow_up_action_lkpcd,aaa_segment_loop_nmbr'
    FROM DUAL;
    SELECT pr.pa_rqst_sid
    || ','
    || pe.pa_error_nmbr
    || ','
    || pe.pa_error_sid
    || ','
    || ped.reject_reason_lkpcd
    || ','
    || ped.follow_up_action_lkpcd
    || ','
    || ped.aaa_segment_loop_nmbr
    || ','
    FROM pa_request pr,
    pa_transaction_request ptr,
    input_acknwldgmnt ia,
    input_batch_file ibf,
    pa_request_service prs,
    pa_request_procedure prp,
    pa_request_procedure_run_error prpre,
    pa_error pe,
    pa_error_detail ped
    WHERE pr.pa_trnsctn_rqst_sid = ptr.pa_trnsctn_rqst_sid
    AND ptr.input_acknwldgmnt_sid = ia.input_acknwldgmnt_sid
    AND ia.input_batch_file_sid = ibf.input_batch_file_sid
    AND pr.pa_rqst_sid = prs.pa_rqst_sid
    AND prs.pa_rqst_srvc_sid = prp.pa_rqst_srvc_sid
    AND prp.pa_rqst_prcdr_sid = prpre.pa_rqst_prcdr_sid
    AND prpre.pa_error_sid = pe.pa_error_sid
    AND pe.pa_error_sid = ped.pa_error_sid
    AND ibf.original_file_name =
    'HIPAA.165760000.20110518I001.278_GC04-3.dat';
    SPOOL off
    SET head on
    SET feed on
    SET termout on
    SET pause on

    833560 wrote:
    Hi all,
    I am spooling 30 query results into one CSV file and I am getting two empty rows for each query.
    Can anyone suggest how to avoid these spaces?
    Thanks & Regards,
    P Prakash
    Hi,
    Save your query in a file and execute that file:
    SQL> @a.sql
    Hope this helps.
    Regards,
    Achyut
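    As for the two blank rows per query, the usual culprit is SQL*Plus page formatting rather than the data. A minimal sketch of the settings that normally suppress them, keeping the rest of the script unchanged (the exact combination needed may vary, so treat this as an assumption to test):
    SET pagesize 0      -- 0 switches page formatting off entirely, unlike a large pagesize
    SET newpage none    -- no blank line at the top of each page
    SET recsep off      -- no blank record-separator lines
    SET feedback off
    SET heading off
    SET trimspool on
    SPOOL c:\tes111.csv replace
    -- the 30 SELECT statements go here unchanged
    SPOOL off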

  • Leading Spaces in CSV file while sending Email

    Hi All,
    I am sending a CSV file via email, and I am getting a leading space in the first field from the second column onwards.
    I have used the code below for appending to the internal table:
    I concatenate all the fields with a comma separator and add 'cl_abap_char_utilities=>newline' at the end.
    For the first column the value comes through correctly; from the second column onwards there are spaces in front of the first field. I have used the FM SO_NEW_DOCUMENT_ATT_SEND_API1 for sending the email. Please suggest. Thanks.

    Hi,
    Concatenate cl_abap_char_utilities=>cr_lf before every line/record of the file.
    This should resolve the issue.
    Thanks,
    Anupam

  • How do I eliminate space on my startup disk so I can free up more space to add OneNote?

    How do I eliminate space on my startup disk so I can free up more space to add OneNote?

    Go step by step and test.
    1. Start up in Safe Mode.
        http://support.apple.com/kb/PH11212
    2. Backup your computer.
    3. Empty Trash.
       http://support.apple.com/kb/PH13806
    4. Disk space / Time Machine / Local Snapshots
      Local backups
       http://support.apple.com/kb/ht4878
    5. Delete old iOS Devices Backup.
        iTunes > Preferences > Devices
        Highlight the old Backups , press “Delete Backup” and then “OK”.
        http://support.apple.com/kb/HT4946?viewlocale=en_US&locale=en_US
    6. Re-index Macintosh HD.
        This will take a while. Wait until it is finished.
        System Preferences > Spotlight > Privacy
        http://support.apple.com/kb/ht2409
    7. Try OmniDiskSweeper. This will show the storage size details of the items.
       https://www.omnigroup.com/more
       Select Macintosh HD and click  “Sweep Selected Drive” at the bottom.
       Delete the files you don’t want to keep.
       Be careful. Delete only the files that can be safely  deleted. If you are not sure about any file, don’t touch it.
    8. Move iTunes, iPhoto and iMovie media folders to an external drive.
        iTunes
        http://support.apple.com/en-us/HT201562
        iPhoto
        http://support.apple.com/kb/PH2506
        iMovie
        http://support.apple.com/kb/ph2289

  • File Adapter: trailing space in field using XSD:Decimal in a CSV file

    Hi Folks,
    I have a problem which i am unable to understand fully.
    We have an SAP-to-file-via-XI scenario where a mail adapter is used. We are producing a CSV file as a mail attachment.
    The issue is that all the decimal fields have an extra space before the next delimiter, i.e. the comma.
    The data type used in the mapping is XSD:Decimal with 'fraction' set to 2.
    I have checked the source XML and there are no trailing spaces there. Initially I thought this might be due to doing the conversion using the transformation beans in the mail adapter; to rule this out I checked other files produced using the FILE adapter, and they also appear to have the same issue.
    I can't get my head around it; I could not find any parameter I need to pass in content conversion, either in the MAIL adapter or the FILE adapter, to suppress the trailing space before the delimiter.
    I suppose this must have occurred with others as well.
    Any directions would be greatly appreciated.
    Btw, we are on PI 7.0 and ECC 6
    -Praveen
    Edited by: - External Consultants Mouchel on Sep 15, 2009 5:34 PM

    Hi Mouchel,
    I personally have not encountered this kind of issue with the file adapter at any point. I would suggest you check the before and after message-mapping payloads in sxmb_moni. You may not spot it in the mapping view, so view the source in Notepad and then check.
    Regards,
    ---Satish

  • How to eliminate spaces while reading a file in BPEL Process

    Hi All,
    How can I eliminate spaces when reading a fixed-length file and inserting it into the database?
    Some of the columns are inserted successfully, but there is one column with a space in front of the value; how do I eliminate that space?
    I have a custom XSD and am using it in the process. How can I resolve this issue so that all rows are inserted into the database successfully?
    Has anyone implemented this kind of functionality?
    Regards,
    CH

    Hi,
    try to use XPath functions like 'normalize-space()', 'right-trim()', 'left-trim()' in your transformation.
    Regards,
    Martin.

  • SQLLOADER fails to load space from the CSV file

    Hello,
    -- I am using SQL*Loader to load tables (Release 10.2.0.4.0).
    -- I have actual spaces in my CSV files, which I want to load as-is into the tables.
    -- However, SQL*Loader loads nulls instead.
    -- Please help.

    Vishal,
    Use this and you should be able to do it; I am attaching a working example based on your input data.
    Table: layout
    create table layout (
    col1 varchar2(10),
    col2 varchar2(100),
    col3 varchar2(10)
    );
    Control file: layout.ctl
    LOAD DATA
    REPLACE PRESERVE BLANKS INTO TABLE KLONDIKE.LAYOUT
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    TRAILING NULLCOLS
    (
    col1 char(10),
    col2 char(100),
    col3 char(10)
    )
    Loading from the command line:
    C:\> sqlldr username/password control=layout.ctl data=layout.txt log=layout.log
    Verify using the following SQL that the data loaded with the whitespace preserved:
    select layout.*, length(col2) from layout;
    Regards
    Edited by: OrionNet on Mar 28, 2009 12:09 PM

  • Import csv file in Address Spaces in an Exchange 2007 Send Connector

    Hello, I must put more than 300 domains in the address space of a Send connector.
    Is it possible to have a CSV file with the 300 domains and a PowerShell script to import this file into the address space of one Send connector?
    Example CSV file:
    cepsa.es
    repsol.com
    parsi.es
    Regards
    Thanks in advance
    mcse 200x + messaging 2000 2003 2007 2010

    Hi
    First, your CSV should be formatted like:
    Name
    cepsa.es
    repsol.com
    parsi.es
    If you would like to set up a new Send connector, you can simply run:
    New-SendConnector -Name ConnectName -AddressSpace ((Import-CSV <PathOfCSV>) | ForEach {$_.Name})
    If you would like to add to a Send connector that already exists, please run:
    $al = (Get-SendConnector -Identity <ConnectName>).AddressSpaces
    $al += ((Import-CSV <PathOfCSV>) | ForEach {$_.Name})
    Set-SendConnector -Name ConnectName -AddressSpace $al
    Cheers
    Zi Feng
    TechNet Community Support
    The first script still works as it should under Exchange 2013 when a Send connector is created for the first time.
    The second part, adding (or removing) address spaces on an existing Send connector, was a little bit trickier.
    The following script did it:
    Get-SendConnector "ConnectorName" | Set-SendConnector -AddressSpace ((Import-CSV <PathOfCSV>) | ForEach {$_.Name})
    Watch out! This command also removes domains that are not present in the CSV file!

  • Report Total Wrapping/Missing Data in CSV File

    PROBLEM:
    We have an application where the totals in the report region wrap when the total is negative and formatted with a negative sign preceding the number (e.g. -43,567.99). The wrapping results in users being confused as to whether a value is negative or positive. We want all non-numeric columns to wrap so that the user does not have to scroll horizontally.
    SOLUTIONS TRIED
    1. Set the CSS Style attribute of a column to white-space:nowrap.
    The value of CSS Style is inserted into the span tags associated with a column value, which in turn eliminates the wrapping in the detail area of the report. However, I have not found a way to insert this type of span tag into the Total of a report region.
    2. Modify the format mask to present negative numbers in brackets (e.g. <43,567.99>).
    This solves the wrapping issue, but it causes a problem when outputting the report to a CSV file: negative numbers that have been formatted using brackets are not included in the output. I believe it interprets them as HTML tags <> and therefore eliminates them from the output.
    3. Create duplicate amount columns in the report, apply a number format that places a negative sign in front of negative numbers, and make these columns display conditionally for CSV output only; then change the original column format mask to use brackets. Although this will work, it seems a bit clunky, results in an unnecessary pull of excess data and will require a lot of re-work/re-testing of our system.
    REQUEST
    Does anyone have any ideas on how I might either:
    1. Add white-space:nowrap to the totals of the report region
    2. Overcome the exclusion of negative numbers containing brackets from the CSV output.
    3. Have another approach to resolving this wrapping issue?
    Thanks,
    David

    According to this article, http://www.cs.tut.fi/~jkorpela/html/nobr.html, this is a known wrapping issue with Internet Explorer. Wrapping will occur when any of the following characters exists: -()[]{}«»%°·\/!?. The author of the article suggests that the only way around this issue is to use white-space:nowrap in a [td] or [tr] tag.
    This would suggest that I need to find a way to add HTML to the total column in the htmldb report, which I don't believe I can do. Does anyone know of a way to insert HTML into these total columns, similar to what can be done using the CSS Style or HTML Expression attributes?
    Thanks,
    David.

  • How to import csv-file in Numbers 3.2.2.

    I started using Numbers instead of Excel. I would like to import CSV files from my bank, but when I open a CSV file in Numbers, everything is imported into the same cell. I composed a test file, 01/08/2014,”text”,”more text”,”even more text”, in Pages, exported it to a text file and changed the extension from .txt to .csv. It did not help; everything was still in the same cell. What must I change to import CSV files successfully? I am using Numbers 3.2.2 on an iMac with a 2,8 GHz Intel Core i7 processor and 8 GB of 1067 MHz DDR3 memory, running OS X 10.9.4.
    Thanks, Joan Voormolen

    You can do this using Pages. Without using outside scripts or functions. The Pages Find/Replace function will let you change the delimiter on the data in your file.
    Open the file in Pages. Click Show Invisibles. (this will show you the delimiter used in the file)
    If you see a * as the delimiter, that is a space. Some data files are space delimited. This is a really poor way to delimit numerical data files.
    If you see a fat arrow to the right, the file is Tab delimited
    Obviously, a comma is not a hidden character. Some files are comma delimited
    Whatever else might have been used as a delimiter (for example a semi colon is sometimes used) will be apparent.
    The delimiter should be something that is not used anywhere else in the "data"... text, numbers, etc., you want to delimit. Numbers considers a comma as a valid delimiter for files with the suffix .csv . It considers a tab as a valid delimiter with files with the suffix .txt . It does not consider spaces a valid delimiter in with any file suffix. But some programs use odd delimiters (semi colon, colon, double spaces, etc) as delimiters.
    Use the Find command, then Find/Replace as you need to create that delimiter numbers recognizes. Let's say a semi colon was used as a delimiter. Enter the current delimiter (semi colon)  into the Find box. Pages should highlight all the instances of your entry. Enter a comma (to create comma delimited data file) in the Replace box. You should now see a comma as the delimiter.
    Important Don't forget, any other comma used in the file will also be considered a delimiter. (a comma in 1,000 for example). So check the data. If you see a comma used another way you will want to eliminate that BEFORE you do the "comma as delimiter" replacement. If you have 1,000, do a find/replace with comma as the find, nothing as the replace, first. THEN do the replacement of the semi colon.
    Now comes the "tricky" part from what I could see. You want to save this new file with a suffix of .csv (export the file). Numbers will only open a comma-delimited file with the data separated (by comma) if its suffix is .csv. Pages only gives you limited export options and puts the file suffix on for you automatically. CSV is not one of the options!
    Choose Text. Pages will name the file .txt. Quit Pages. Go to the file on your desktop (or wherever you saved it). Change the file suffix from .txt to .csv.
    That's it. Open the file with Numbers. Numbers will create a separate column for everything between the comma's.
    You can use this same method to alter your data file before you import it into Numbers. For example, one file I wanted to import had time=xxx . I only wanted the actual time, not the text attached to it, in my spreadsheet. I did a find/replace with "time=" as the find. A comma as the replace. Even though "time=xxx" is one "word", Pages identified the "time=" within the word to allow the replacement.
    Numbers does not provide a "choose delimiter" function when opening a file. Instead it automatically uses the standard delimiter based on the file suffix. CSV means Comma, so if the file is named .csv it will only look for and use a comma as the delimiter to put the data into separate columns. I believe .txt uses only a tab as the delimiter. In the above example you could find/replace to a Tab. Then Export to Text. And numbers will open the data into columns the way you want, without the extra step of renaming the file on your desktop.
    While some files use a second space (ie two in a row) as a delimiter that's a nasty way to delimit. You always want a specific delimiter that is not used within the data element.
    The above is to import numerical data into separate columns. You could use the same method to manipulate a file that contains text. Let's say you had a file with the suffix .txt. In the file are names and addresses.  John Smith 246 Rose Road . You want Name in one column. Address in another.  Look at all the spaces, which ones should be delimiters which not? Are there any delimiters in the file?
    If you open with Pages and choose show invisibles you can see. You might see John Smith --> 246 Rose Road. (the --> will look like a fat arrow in Pages). Numbers will open this file, IF it has .txt as the suffix, based on the Tab,  with name in one column, Address in another.
    Or you might see John*Smith**246*Rose*Road. Even though the creator of this intended two spaces to be a delimiter Numbers does not recognize that. Numbers will put everything into one column. The fix? In Pages, put a tab between name and address. Find/replace two spaces with Tab. Export, as Text.
    Based on what you see (with show invisible active) in Pages, you can use the Find/Replace function to create the specific delimiter you want (tab or comma). You can use that function to manipulate the file easily so the data you want shows up in separate columns. You may need to get clever to accomplish the unique delimiters. You might even need to do two passes with Find/Replace.
    In the instance above  if there was only one space between each element. (not two as a pseudo delimiter) You could replace all spaces with a tab in Pages. Export as Text.  Numbers will open that file with a column for each word (one for John, one for Smith). Then  "Merge" the two cells (columns) you want to put back together. 

  • Issue in conversion of output file from alv to csv file using GUI_DOWNLOAD

    hi,
    I am using GUI_DOWNLOAD to convert the internal table that I get as the output of an ALV into a CSV (comma-separated) file. I am using the following code, but it is not generating a CSV file; instead it is generating a normal space-delimited file.
    The code is as follows:
    data : lv_fname type string.
    lv_fname = 'C:\Users\pratyusha_tripathi\Desktop\status8.csv'. " Provide the file path & file name with CSV extention
    CALL FUNCTION 'GUI_DOWNLOAD'
    EXPORTING
    filename = lv_fname " File name including path, give CSV as extention of the file
    FILETYPE = 'DAT'
    WRITE_FIELD_SEPARATOR = '#' " Provide comma as separator
    tables
    data_tab = ITAB " Pass the Output internal table
    FIELDNAMES =
    EXCEPTIONS
    OTHERS = 22
    IF sy-subrc <> 0.
    MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
    WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
    ENDIF.
    Kindly let me know what changes can be made to make my code work. Also, can GUI_DOWNLOAD be used for batch processing and for storing the output on the application server?
    Thanks ,
    Pratyusha

    Hi,
    the short text description for WRITE_FIELD_SEPARATOR is "Separate Columns by Tabs in Case of ASCII Download", so why do you expect a comma?
    Try SAP_CONVERT_TO_CSV_FORMAT and then download.
    And no, GUI_DOWNLOAD is only for download via SAP GUI to a users computer.
    Best regards,
    Oliver

  • How can I make an easy *.CSV file to load into database table

    Hi All,
    I have a huge Excel sheet with columns item#, description and qty. The description column may sometimes be a one-word name, a two-word name separated with a space, or a comma-separated name. I want to write PL/SQL code which will read this file and load it into a database table. Now, the *.CSV file is either comma delimited or tab delimited, and neither solves my issue. Is there any better solution that avoids manual editing of the *.CSV file so I can easily load it into the table?
    Your help is appreciated,
    Thanks
    Zahir

    SQL*Loader is probably the fastest method, but since you specifically asked for a PL/SQL method:
    http://asktom.oracle.com/pls/ask/f?p=4950:8:::::F4950_P8_DISPLAYID:464420312302
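    If the file can be placed somewhere the database can read it, an external table is another PL/SQL-friendly option: the CSV becomes queryable and a single INSERT ... SELECT loads it. A minimal sketch, in which the directory path, file name, column names and the target table items are all assumptions:
    -- Sketch only: assumes descriptions containing commas are enclosed in double quotes in the CSV
    create or replace directory data_dir as '/data/incoming';
    create table items_ext (
      item_no     varchar2(30),
      description varchar2(200),
      qty         number
    )
    organization external (
      type oracle_loader
      default directory data_dir
      access parameters (
        records delimited by newline
        fields terminated by ',' optionally enclosed by '"'
        missing field values are null
      )
      location ('items.csv')
    );
    insert into items (item_no, description, qty)
    select item_no, description, qty from items_ext;
    If the descriptions are not quoted in the source file, no delimiter choice will be safe, so exporting from Excel as quoted CSV (or tab-delimited, since tabs rarely appear in names) is still the easier fix.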

  • Display data from CSV file in iWeb page

    Hi,
    I would like to display data from a CSV file in an iWeb page when a date value from the CSV file matches today's date on the system. Here is an example.
    CSV data values
    01/20/2011,Sunny,87
    01/21/2011,Cloudy,100
    01/22/2011,Rainy,60
    If today's date value is 01/21/2011 the page should display 01/21/2011 Cloudy 100 in a tabular format.
    Appreciate your help in providing HTML code for this issue.
    Thanks

    I suspect there is a soft return in the excel database somewhere that can't be seen. Take the csv/txt file into notepad and look for a line that starts oddly compared to the others.
    I haven't had luck removing soft returns from excel files so I do this a rather odd way. I take the excel file into InDesign as a table, and then use find/change to replace any soft returns with nothing, then convert the text to table and then export the text out again by going export, and selecting text from the dropdown menu.
    For my money, I always save tab delimited text files from excel so that if a field does contain commas, it doesn't "trick" indesign into thinking a new field is beginning or not... instead the field delimiters are tabs and they are unlikely to have been used in the excel database.
    If you do choose to use this InDesign import method of mine to clean up the database, I also noticed two things in your screengrab: first, some fields have spaces at the start of the text... easy enough to fix with a GREP that looks for ^\s (start of a line followed by a space) and replaces it with nothing. The second thing is the T&C field: all entries (at least in the screengrab) start the same – if all entries in the database start the same, couldn't that line be in the InDesign file? It's only a small detail, I know.

  • Delete a .csv file from desktop system

    Hi All,
    My requirement is to read the .csv file from the desktop system's shared folder and delete the file after it is read successfully.
    I can read the .csv file from that location using the function RFC_REMOTE_FILE and load the content into an internal table.
    But I can't delete the file from the presentation server (desktop system).
    Can anyone tell me how to delete the .csv file from the desktop system at a different location?
    Note:
    I followed this link to read file:
    http://www.sdn.sap.com/irj/scn/index?rid=/library/uuid/9831750a-0801-0010-1d9e-f8c64efb2bd2&overridelayout=true

    Hi Rob,
    Thanks. I solved this problem myself.
    The solution to delete the file from remote system is
    concatenate 'DEL' i_filename i_dirname into v_bkfile separated by space.
    call function 'RFC_REMOTE_EXEC'
      destination  c_dest
      exporting
        command               = v_bkfile
      exceptions
        system_failure        = 1  MESSAGE v_ermsg
        communication_failure = 2  MESSAGE v_ermsg.
