Handling CSV files from Crystal 10

I'm encountering some difficulty with a report that uses a CSV file as input. Crystal is automatically interpreting a column as a numeric data type instead of text. Any ideas how I can force Crystal to interpret it differently? Thanks a lot.

Hi, thanks for the reply. I'm connecting using ADO for Access/Excel; I also tried using ODBC for a text file. Column number 9 ('Sec. ID Num') is the column in question.
Here's the contents of a sample file (it's not too small, unfortunately):
Record Type,Portfolio Number,Purs/Sale,Trans. Number,Settlement Date,Trade Date,Asset Type,Sec. ID Type,Sec. ID Num,Sec Name,Euroclear Ind,Nominal,Quoted Ccy,Trade Price,Consideration,Comm,Misc. Comm,Inc/Pch Sold,Sett Ccy,Sett Cash,Trd Broker,Trd Broker Nm,Trd Brok Acc #,Sett Broker,Sett Broker Nm,Sett Broker Acc#,Comments,Trade Src ID,Cancel Trade,Rel Trans ID, Port Num,Dim Trans Num,Header 1,Header 2,Header 3,Header 4,Header 5
DET,AAAA,P,3072779,20070126,20070123,EQ,SD,AAAABBB,Stock A,,1000,EUR,10,10000,100.0000000,0.0000000,0.00,EUR,10100.00,MLILGB3LESF,MERRIL-02,,BKAUATWW,,,,9644303,N,0,A-SUMMBAL,20070124000002,HDR,9644303,16184,20050201,161849
DET,AAAA,P,3072779,20070126,20070123,EQ,SD,4943111,Stock A,,1000,EUR,10,10000,100.0000000,0.0000000,0.00,EUR,10100.00,MLILGB3LESF,MERRIL-02,,BKAUATWW,,,,9644303,N,0,A-SUMMBAL,20070124000002,HDR,9644303,16184,20070201,161849
DET,AAAA,S,3072780,20070126,20070123,EQ,SD,5541979,Stock A,,1000,EUR,10,10000,100.0000000,0.0000000,0.00,EUR,9900.00,SBILGB2L,SALBRO-01,,CITTGB2L,,,,9644303,N,0,A-SUMMBAL,20070124000003,HDR,9644303,16184,20070201,161849
DET,AAAA,P,3072781,20070126,20070123,EQ,SD,2676302,Stock A,,1000,CAD,10,10000,100.0000000,0.0000000,0.00,CAD,10100.00,SLIIGB2L,LEHBRO-01,T12702711,ROYCCAT2,,/ADEP/RBCT,,9644303,N,0,A-SUMIGLOB,20070124000004,HDR,9644303,16184,20070201,161849
DET,AAAA,P,3072782,20070126,20070123,EQ,SD,7103065,Stock A,,1000,CHF,10,10000,100.0000000,0.0000000,0.00,CHF,10100.00,MLILGB3LESF,MERRIL-02,,MLILGB3LESF,,/ADEP/GB101073,,9644303,N,0,A-SUMIGLOB,20070124000005,HDR,9644303,16184,20070201,161849
DET,AAAA,P,3072783,20070126,20070123,EQ,SD,5086577,Stock A,,1000,EUR,10,10000,100.0000000,0.0000000,0.00,EUR,10100.00,MLILGB3LESF,MERRIL-02,6500440000,PARBDEFF,,/ADEP/0007259,,9644303,N,0,A-SUMIBAL,20070124000006,HDR,9644303,16184,20070201,161849
DET,AAAA,P,3072784,20070126,20070123,EQ,SD,B01CP21,Stock A,,1000,EUR,10,10000,100.0000000,0.0000000,0.00,EUR,10100.00,MLILGB3LESF,MERRIL-02,,PARBESM,,,,9644303,N,0,A-SUMMTECH,20070124000007,HDR,9644303,16184,20070201,161849
DET,AAAA,S,3072785,20070126,20070123,EQ,SD,5051252,Stock A,,1000,EUR,10,10000,100.0000000,0.0000000,0.00,EUR,9900.00,MLILGB3LESF,MERRIL-02,,NDEAFIHH,,,,9644303,N,0,A-SUMMTECH,20070124000008,HDR,9644303,16184,20070201,161849
DET,AAAA,P,3072786,20070126,20070123,EQ,SD,7088429,Stock A,,1000,EUR,10,10000,100.0000000,0.0000000,0.00,EUR,10100.00,ABNAGB22,ABNAMR-03,23667,ABNAGB22,,/ADEP/00611,,9644303,N,0,A-SUMMBAL,20070124000009,HDR,9644303,16184,20070201,161849
DET,AAAA,P,3072787,20070126,20070123,EQ,SD,B00G0S5,Stock A,,1000,HKD,10,10000,100.0000000,110.0000000,0.00,HKD,10210.00,MLPFUS31XXX,MERRIL-02,000000438280,MLFEHKHH,,,,9644303,N,0,A-SUMIGLOB,20070124000010,HDR,9644303,16184,20070201,161849
DET,AAAA,P,3072788,20070126,20070123,EQ,SD,5013832,Stock A,,1000,EUR,10,10000,100.0000000,101.2500000,0.00,EUR,10201.25,DAVYIE21,DAVBRO-01,,DAVYIE21,,/CRST/189,,9644303,N,0,A-SUMMBAL,20070124000011,HDR,9644303,16184,20070201,161849
DET,AAAA,P,3072789,20070126,20070123,EQ,SD,2569286,Stock A,,1000,USD,10,10000,100.0000000,0.0000000,0.00,USD,10100.00,MLPFUS31XXX,MERRIL-02,,MLPFUS31,,/ADTC/161,,9644303,N,0,A-SUMIGLOB,20070124000012,HDR,9644303,16184,20070201,161849
DET,AAAA,P,3072790,20070126,20070123,EQ,SD,B0DJNG0,Stock A,,1000,EUR,10,10000,100.0000000,0.0000000,0.00,EUR,10100.00,SBILGB2L,SALBRO-01,,CITIITMX,,,,9644303,N,0,A-SUMMGROW,20070124000013,HDR,9644303,16184,20070201,161849
DET,AAAA,S,3072791,20070126,20070123,EQ,SD,6435145,Stock A,,1000,JPY,10,10000,100.0000000,0.0000000,0.00,JPY,9900.00,MLPFUS31XXX,MERRIL-02,,MLCOJPJT,,,,9644303,N,0,A-SUMIGLOB,20070124000014,HDR,9644303,16184,20070201,161849
DET,AAAA,P,3072792,20070126,20070123,EQ,SD,4942818,Stock A,,1000,USD,10,10000,100.0000000,0.0000000,0.00,USD,10100.00,MSNYUS33,MORGAN-02,,MSNYUS33,,/ADTC/050,,9644303,N,0,A-SUMIGLOB,20070124000015,HDR,9644303,16184,20070201,161849
DET,AAAA,S,3072793,20070126,20070123,EQ,SD,5727360,Stock A,,1000,EUR,10,10000,100.0000000,0.0000000,0.00,EUR,9900.00,MSLNGB2X,MORGAN-01,47612G,PARBFRPPNLC,,/ADEP/019,,9644303,N,0,A-SUMMGROW,20070124000016,HDR,9644303,16184,20070201,161849
DET,AAAA,P,3072794,20070126,20070123,EQ,SD,B11HK39,Stock A,,1000,NOK,10,10000,0.0000000,0.0000000,0.00,NOK,10000.00,MIDLGB22JAC,HSBCGP-01,,ESSENOKX,,,,9644303,N,0,A-SUMIBAL,20070124000017,HDR,9644303,16184,20070201,161849
DET,AAAA,P,3072795,20070126,20070123,EQ,SD,5959378,Stock A,,1000,SEK,10,10000,100.0000000,0.0000000,0.00,SEK,10100.00,MLILGB3LESF,MERRIL-02,,ESSESESS,,,,9644303,N,0,A-SUMMTECH,20070124000018,HDR,9644303,16184,20070201,161849
DET,AAAA,P,3072796,20070126,20070123,EQ,SD,6916844,Stock A,,100,SGD,10,1000,100.0000000,0.0000000,0.00,SGD,1100.00,MLPFUS31XXX,MERRIL-02,,MLSSSGSG,,,,9644303,N,0,A-SUMIGLOB,20070124000019,HDR,9644303,16184,20070201,161849
DET,AAAA,P,3072797,20070126,20070123,EQ,SD,2113382,Stock A,,1000,USD,10,10000,100.0000000,0.0000000,0.00,USD,10100.00,MLPFUS31XXX,MERRIL-02,,MLPFUS31,,/ADTC/161,,9644303,N,0,A-SUMMTECH,20070124000020,HDR,9644303,16184,20070201,161849
DET,AAAA,P,3072798,20070126,20070123,EQ,SD,3134865,Stock A,,1000,GBP,10,10000,100.0000000,101.0000000,0.00,GBP,10201.00,CITIGB2L,SANBER-01,,CITIGB2L,,/CRST/899,,9644303,N,0,A-SUMMGROW,20070124000021,HDR,9644303,16184,20070201,161849
DET,AAAA,P,3072799,20070126,20070123,EQ,SD,2890005,Stock A,,1000,USD,10,10000,100.0000000,0.0000000,0.00,USD,10100.00,MLPFUS31XXX,MERRIL-02,,MLPFUS31,,/ADTC/161,,9644303,N,0,A-SUMMTECH,20070124000022,HDR,9644303,16184,20070201,161849
DET,AAAA,P,3072815,20070126,20070123,EQ,SD,2890005,Stock A,,1000,USD,11,11000,101.0000000,0.0000000,0.00,USD,11101.00,MLPFUS31XXX,MERRIL-02,,MLPFUS31,,/ADTC/161,,9644303,Y,3072799,A-SUMMTECH,20070124000022,HDR,9644303,16184,20070201,161849
DET,AAAA,P,3072800,20070126,20070123,FI,SD,7624641,Stock A,Y,1000,EUR,100,1000,0.0000000,0.0000000,21.16,EUR,1021.16,JPMSGB2L,CHAINV-04,,JPMSGB2L,,/AEID/95724,,9644303,N,0,A-SUMMGROW,20070124000023,HDR,9644303,16184,20070201,161849
DET,AAAA,S,3072801,20070126,20070123,FI,SD,7624641,Stock A,Y,1000,EUR,100,1000,0.0000000,0.0000000,21.16,EUR,1021.16,BARCGB33,BARCLY-03,,BARCGB33,,/ACID/34797,,9644303,N,0,A-SUMMGROW,20070124000024,HDR,9644303,16184,20070201,161849
DET,AAAA,P,3072802,20070126,20070123,FI,SD,3258486,Stock A,Y,1000,EUR,100,1000,0.0000000,0.0000000,25.20,EUR,1025.20,RBOSGB2RTCM,RBSGRP-01,,RBOSGB2RTCM,,/AEID/97802,,9644303,N,0,A-SUMMGROW,20070124000025,HDR,9644303,16184,20070201,161849
DET,AAAA,P,3072810,20070126,20070123,FI,SD,3258486,Stock A,Y,1000,EUR,100,1000,0.0000000,0.0000000,25.20,EUR,1025.20,RBOSGB2RTCM,RBSGRP-01,,RBOSGB2RTCM,,/AEID/97802,,9644303,Y,3072802,A-SUMMGROW,20070124000025,HDR,9644303,16184,20070201,161849
DET,AAAA,P,3072803,20070126,20070123,FI,SD,B0LNX64,Stock A,,1000,GBP,100,1000,0.0000000,0.0000000,16.42,GBP,1016.42,BARCGB33,BARCLY-03,,BARCGB33,,/CRST/034,,9644303,N,0,A-SUMMBAL,20070124000026,HDR,9644303,16184,20070201,161849
DET,AAAA,P,3072804,20070126,20070123,FI,SD,B06FWG8,Stock A,Y,1000,EUR,100,1000,0.0000000,0.0000000,15.78,EUR,1015.78,DEUTGB2L,DEUTSC-03,,DEUTGB2L,,/AEID/91255,,9644303,N,0,A-SUMMGROW,20070124000027,HDR,9644303,16184,20070201,161849
TLR,28,219738.17,9644303,,,,,,,,,,,,,,,,,,,,,,,,,,,,,HDR,9644303,16184,20070201,161849
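When the file is read through the ODBC text driver (and usually also through the Jet engine that ADO uses for text), the standard way to stop the driver from guessing column types is a schema.ini file placed in the same folder as the CSV. A minimal sketch, assuming the file is named trades.csv (the section name must match your actual file name, and in practice every column should be declared, Col1 through Col37, in file order):

```ini
; schema.ini - lives in the same directory as the CSV file.
; Section name must match the CSV file name ("trades.csv" is a placeholder).
[trades.csv]
Format=CSVDelimited
ColNameHeader=True
; Declaring columns explicitly overrides the driver's type guessing.
; Only the troublesome column is shown here; list all columns in order.
Col9="Sec. ID Num" Text
```

With this in place the driver reads column 9 as text, so values like B01CP21 and 4943111 both come through as strings.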

Similar Messages

  • Reading a CSV file from server

    Hi All,
    I am reading a CSV file from the server, and my internal table has only one field, with length 200. The input CSV file has more than one column, and after splitting the file my internal table should have the same number of rows as the input record has columns.
    But when I do that, the last field in the internal table is appended with #.
    Can somebody tell me the solution for this?
    You can see my code below.
    data: begin of itab_infile occurs 0,
             input(3000),
          end of itab_infile.
    data: begin of itab_rec occurs 0,
             record(200),
          end of itab_rec.
    data: c_comma(1) value ','.
    open dataset f_name1 for input in text mode encoding default.
    if sy-subrc <> 0.
      write: /, 'FILE NOT FOUND'.
      exit.
    endif.
    do.
      read dataset f_name1 into itab_infile-input.
      if sy-subrc <> 0.
        exit.
      endif.
      split itab_infile-input at c_comma into table itab_rec.
    enddo.
    Thanks in advance.
    Sunil

    Sunil,
    You do not mention the platform on which the CSV file was created and the platform on which it is read.
    A common problem with CSV files created on MS/Windows platforms and read on Unix is the end-of-record (EOR) characters:
    MS/Windows uses <CR><LF> as the EOR;
    Unix uses <LF> alone, so the <CR> is left dangling on the last field.
    If on Unix, open the file using vi in a telnet session to confirm the EOR type.
    The fix options:
    1) Before opening the file in your ABAP program, run the unix command dos2unix.
    2) Transfer the file from the MS/Windows platform to Unix via FTP using ascii mode, not bin. This does the dos2unix conversion on the fly.
    3) Install SAMBA and share the load directory to the Windows platforms. SAMBA also handles the dos2unix and unix2dos conversions on the fly.
    Hope this helps
    David Cooper
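    The EOR mismatch David describes can be sketched outside ABAP (values are illustrative, not from the original program): a CRLF-terminated record read on a system that splits on <LF> alone leaves the carriage return glued to the last field, which SAP then displays as #.

```python
# Why the last field gains a stray character when a Windows CSV (CRLF
# line endings) is read on Unix, and how stripping the CR fixes the split.
windows_record = "DET,AAAA,P,3072779\r\n"   # as written by MS/Windows

# Splitting "lines" on \n only leaves \r attached to the last field.
line = windows_record.split("\n")[0]         # "DET,AAAA,P,3072779\r"
fields = line.split(",")
print(repr(fields[-1]))                      # '3072779\r' - shown as 3072779# in SAP

# Fix: strip the end-of-record characters before splitting.
clean_fields = line.rstrip("\r\n").split(",")
print(clean_fields[-1])                      # 3072779
```

    The dos2unix / ASCII-mode FTP options above do exactly this stripping before the file ever reaches the program.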

  • Accessing CSV File from URL

    Hi Experts,
    I am developing an interface that needs to access a CSV file from a URL.
    The URL is: http://200.218.208.119/download/fechamento/20100413.csv
    The file is replaced every day, so tomorrow the file will be: 20100414.csv
    My interface generates the following errors:
      <Trace level="1" type="T">---- Plain HTTP Adapter Outbound----</Trace>
      <Trace level="1" type="T">---------------------------------------------</Trace>
    - <Trace level="1" type="B" name="CL_HTTP_PLAIN_OUTBOUND-ENTER_PLSRV">
      <Trace level="3" type="T">Quality of Service BE</Trace>
      <Trace level="1" type="T">Get XML-Dokument from the Message-Objekt</Trace>
      <Trace level="3" type="T">URL http://200.218.208.119:80/download/fechamento/20100413.csv</Trace>
      <Trace level="3" type="T">Proxy Host:</Trace>
      <Trace level="3" type="T">Proxy Service:</Trace>
      <Trace level="3" type="T">~request_method POST</Trace>
      <Trace level="3" type="T">~server_protocol HTTP/1.0</Trace>
      <Trace level="3" type="T">accept: */*</Trace>
      <Trace level="3" type="T">msgguid: A7031081480011DFA526001517D1434C</Trace>
      <Trace level="3" type="T">service: D0B_100</Trace>
      <Trace level="3" type="T">interface namespace: http://bcb.gov.br/xi/OB02</Trace>
      <Trace level="3" type="T">interface name: MI_RFC_OUT</Trace>
      <Trace level="3" type="T">Header-Fields</Trace>
      <Trace level="3" type="T">Prolog conversion Codepage: UTF-8</Trace>
      <Trace level="3" type="T">Epilog conversion Codepage: UTF-8</Trace>
      <Trace level="3" type="T">content-length 133</Trace>
      <Trace level="3" type="T">content-type: text/csv; charset=UTF-8</Trace>
      <Trace level="2" type="T">HTTP-Response :</Trace>
      <Trace level="1" type="T">Method Not Allowed</Trace>
      <Trace level="2" type="T">Code : 405</Trace>
      <Trace level="2" type="T">Reason: Method Not Allowed</Trace>
      <Trace level="2" type="T">HTTP-response content-length 6261</Trace>
    Can anyone help?

    > I am developing an interface that needs to access a CSV file from a URL.
    This is not supported in PI standard.
    I think the option will be available in PI 7.3, but I cannot promise it.
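    For reference, the trace shows the adapter sending ~request_method POST, and the 405 is the web server refusing that method; a plain GET against the day-stamped URL is what this kind of download endpoint normally expects. A minimal sketch outside PI (URL pattern taken from the post; the actual fetch is commented out since the host may no longer serve the file):

```python
# Build the day-stamped download URL and fetch it with GET instead of POST.
from datetime import date
from urllib.request import urlopen  # fetch shown commented out below

def fechamento_url(d: date) -> str:
    """URL pattern from the post: the file name is the date as YYYYMMDD."""
    return f"http://200.218.208.119/download/fechamento/{d:%Y%m%d}.csv"

url = fechamento_url(date(2010, 4, 13))
print(url)  # http://200.218.208.119/download/fechamento/20100413.csv
# data = urlopen(url).read()  # plain GET; avoids the 405 the POST provoked
```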

  • Delete a .csv file from desktop system

    Hi All,
    My requirement is to read a .csv file from a desktop system with a shared folder, and to delete the file after it has been read successfully.
    I can read the .csv file from that location using the function RFC_REMOTE_FILE and update the contents into an internal table.
    But I can't delete the file from the presentation server (desktop system).
    Can anyone tell me how to delete a .csv file from a desktop system at a different location?
    Note:
    I followed this link to read file:
    http://www.sdn.sap.com/irj/scn/index?rid=/library/uuid/9831750a-0801-0010-1d9e-f8c64efb2bd2&overridelayout=true

    Hi Rob,
    Thanks. I solved this problem myself.
    The solution to delete the file from remote system is
    concatenate 'DEL' i_filename i_dirname into v_bkfile separated by space.
    call function 'RFC_REMOTE_EXEC'
      destination c_dest
      exporting
        command               = v_bkfile
      exceptions
        system_failure        = 1 message v_ermsg
        communication_failure = 2 message v_ermsg.

  • Error loading csv file from application server

    Hi all,
    While uploading a CSV file from the application server to the PSA, we are getting the following error:
    Error 2 while splitting CSV data record
    Message no. RSDS_ACCESS011
    Diagnosis
    Error 2 occurred while splitting the CSV data record 1
    1 = Could not find a closing escape character
    2 = Invalid escape character
    3 = Conversion error
    4 = Other error
    System Response
    The function was terminated.
    Procedure
    Check the values of the data separator and escape sign, and try again.
    But I've checked the file, and the escape sign and data separator in it are fine. We are able to load the same file successfully in the quality system.
    How can I solve this error?
    Thanks in advance.

    Hi BI consultant:
       Could you please provide more details?
    For example:
    1.Is your P application server a UNIX flavor? (Solaris, AIX, UX, Linux)
       If yes..
             2. Are you able to see the contents of the file correctly with a "cat" or "vi" command? (at operating system level).
                   If no...
                         3. Did you upload the csv flat file to the server via FTP?
                                If yes...
                                     4. Did you use the "binary" or the "ascii" parameter on the FTP command used to upload the file?
    Probably you need to upload the CSV file again to your application server and make sure you can see the file contents ("cat" or "vi" command) before trying to execute the InfoPackage.
    Regards,
    Francisco Milán.
    Edited by: Francisco Milan on Jun 3, 2010 11:13 AM
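    Before re-uploading, errors 1 and 2 above can often be confirmed by eye: they usually mean a record with an unbalanced escape sign. A quick check outside BW (sample records are made up; the escape sign defaults to a double quote, adjust to your DataSource settings):

```python
# Flag records whose escape sign is unbalanced - the usual cause of
# "Could not find a closing escape character" when splitting CSV records.
def unbalanced_quote_lines(lines, escape_sign='"'):
    """Return 1-based line numbers with an odd count of the escape sign."""
    return [i for i, line in enumerate(lines, start=1)
            if line.count(escape_sign) % 2 != 0]

records = ['1,"ACME, Inc.",100', '2,"Broken record,200', '3,plain,300']
print(unbalanced_quote_lines(records))  # [2]
```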

  • Download CSV-File from Interactive Report

    Dear all,
    I want to download a CSV file from an interactive report.
    The content of the rows in Excel:
    Erste,"B","C","D","E","F","G","H","I","J","K","L","M","N","O","P"
    IBC ->,"-","1","2","3","4","5","6","7","8","9","10","11","12","13","14" [...]
    How can I solve this problem?
    Thanks for help.

    Your CSV file uses a comma as the separator, but Excel expects the CSV separator to be a semicolon.
    Try setting "CSV Separator" = ";" in section "Report Attributes" - "Download".
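    The effect of that setting can be sketched outside APEX (sample rows taken from the post): written with a semicolon separator, locales where Excel expects ";" open the file as proper columns instead of one comma-joined cell per row.

```python
# Write rows with a semicolon delimiter, matching Excel's locale expectation.
import csv
import io

rows = [["Erste", "B", "C", "D"], ["IBC ->", "-", "1", "2"]]
buf = io.StringIO()
csv.writer(buf, delimiter=";", lineterminator="\n").writerows(rows)
print(buf.getvalue())
# Erste;B;C;D
# IBC ->;-;1;2
```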

  • Stored Proc to create .csv file from table

    Hi all,
    I have a table with 4 columns and 10 rows. Now I need a stored proc to create a .csv file from the table data. Here are the scripts to create the table and insert the rows.
    Create table emp(emp_id number(10), emp_name varchar2(20), department varchar2(20), salary number(10));
    Insert into emp values ('1','AAAAA','SALES','10000');
    Insert into emp values ('2','BBBBB','MARKETING','8000');
    Insert into emp values ('3','CCCCC','SALES','12000');
    Insert into emp values ('4','DDDDD','FINANCE','10000');
    Insert into emp values ('5','EEEEE','SALES','11000');
    Insert into emp values ('6','FFFFF','MANAGER','90000');
    Insert into emp values ('7','GGGGG','SALES','12000');
    Insert into emp values ('8','HHHHH','FINANCE','14000');
    Insert into emp values ('9','IIIII','SALES','20000');
    Insert into emp values ('10','JJJJJ','FINANCE','21000');
    commit;
    Now I need a stored proc to create a .csv file in my local location. Please let me know If you need any other details....

    Some pointers:
    http://www.oracle-base.com/articles/9i/GeneratingCSVFiles.php
    http://tkyte.blogspot.com/2009/10/httpasktomoraclecomtkyteflat.html
    also, doing a search on this forum or http://asktom.oracle.com will give you many clues.
    > .csv file in my local location.
    What is your 'local location'?
    A client machine? The database server machine?
    What database version are you using?
    (the result of: select * from v$version; )
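    This is not the requested stored procedure (the links above cover the server-side UTL_FILE / PL/SQL approaches); as a client-side sketch, dumping query rows to CSV looks like this, with the emp rows stubbed in as a plain list standing in for a database cursor:

```python
# Client-side sketch: write header + rows (e.g. fetched from a cursor) as CSV.
import csv
import io

def rows_to_csv(headers, rows, out):
    """Write a header row followed by data rows to a file-like object."""
    w = csv.writer(out, lineterminator="\n")
    w.writerow(headers)
    w.writerows(rows)

# Stub data standing in for "select * from emp".
headers = ["EMP_ID", "EMP_NAME", "DEPARTMENT", "SALARY"]
rows = [(1, "AAAAA", "SALES", 10000), (2, "BBBBB", "MARKETING", 8000)]
buf = io.StringIO()
rows_to_csv(headers, rows, buf)
print(buf.getvalue().splitlines()[0])  # EMP_ID,EMP_NAME,DEPARTMENT,SALARY
```

    With a real database the `rows` stub would be replaced by a cursor fetch; whether that runs on the client or the server is exactly the 'local location' question raised above.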

  • SSIS 2008 – Read roughly 50 CSV files from a folder, create SQL table from them dynamically, and dump data.

    Hello everyone,
    I’ve been assigned one requirement wherein I would like to read around 50 CSV files from a specified folder.
    In step 1 I would like to create the schema for these files: take the CSV files one by one and create a SQL table for each, if it does not exist at the destination.
    In step 2 I would like to append the data of these 50 CSV files into respective table.
    In step 3 I would like to purge data older than a given date.
    Please note, the data in these CSV files would be very bulky, I would like to know the best way to insert bulky data into SQL table.
    Also, in some of the CSV files, there will be 4 rows at the top of the file which have the header details/header rows.
    According to my knowledge I will be asked to implement this on SSIS 2008, but I'm not 100% sure of it.
    So, please feel free to provide multiple approaches if we can achieve these requirements elegantly in newer versions like SSIS 2012.
    Any help would be much appreciated.
    Thanks,
    Ankit
    Thanks, Ankit Shah - Inkey Solutions, India - Microsoft Certified Business Management Solutions Professional - http://ankit.inkeysolutions.com

    Hello Harry and Aamir,
    Thank you for the responses.
    @Aamir, thank you for sharing the link. Yes, I'm going to use a Script Task to read the header columns of the CSV files, preparing one SSIS variable holding the SQL script to create the required table (with an IF EXISTS condition) inside the Script Task itself.
    I will be having "Execute SQL task" following the script task. And this will create the actual table for a CSV.
    Both these components will be inside a for each loop container and execute all 50 CSV files one by one.
    Some points to be clarified,
    1. In the bunch of these 50 CSV files there will be some exceptions for which we first need to purge the tables and then insert the data. Meaning for 2 files out of 50, we need to first clean the tables and then perform the data insert, while the rest of the 48 files should be appended on a daily basis.
    Can you please advise what is the best way to achieve this requirement? Where should we configure such exceptional cases for the package?
    2. For some of the CSV files we would have more than one file with the same name. For example, out of 50, the 2nd file is divided into 10 different CSV files, so in total we have 60 files, of which 10 have repeated file names. How can we manage this criterion within the same loop? Do we need one more for-each loop inside the parent one? What is the best way to achieve this requirement?
    3. There will be another package, which will be used to purge data from the SQL tables. Unlike the above package, this package will not run on a daily basis. At some point we would like these 50 tables to be purged with an older-than criterion, say remove data older than 1st Jan 2015. What is the best way to achieve this requirement?
    Please know, I'm very new in SSIS world and would like to develop these packages for client using best package development practices.
    Any help would be greatly appreciated.
    1. In the bunch of these 50 CSV files there will be some exception for which we first need to purge the tables and then insert the data. Meaning for 2 files out of 50, we need to first clean the tables and then perform data insert, while for the rest 48 files, they should be appended on daily basis.
    Can you please advise what is the best way to achieve this requirement? Where should we configure such exceptional cases for the package?
    How can you identify these files? Is it based on the file name, or is there some info in the file which indicates that it requires a purge? If yes, you can pick this information during the file-name or file-data parsing step and set a boolean variable. Then in the control flow have a conditional precedence constraint which checks the boolean variable and, if set, executes an Execute SQL Task to do the purge (you can use TRUNCATE TABLE or DELETE FROM TableName statements).
    2. For some of the CSV files we would be having more than one file with the same name. Like out of 50 the 2nd file is divided into 10 different CSV files. so in total we're having 60 files wherein the 10 out of 60 have repeated file names. How can we manage this criteria within the same loop, do we need to do one more for each looping inside the parent one, what is the best way to achieve this requirement?
    The best way to achieve this is to append a sequential value to the filename (maybe a timestamp) and then process them in sequence. This can be done prior to the main loop, so that you can use the same loop to process these duplicate filenames as well. The best thing would be to use the file-creation-date attribute value so that the files get processed in the right sequence. You can use a script task to get this for each file, as below:
    http://microsoft-ssis.blogspot.com/2011/03/get-file-properties-with-ssis.html
    3. There will be another package, which will be used to purge data for the SQL tables. Meaning unlike the above package, this package will not run on daily basis. At some point we would like these 50 tables to be purged with older than criteria, say remove data older than 1st Jan 2015. what is the best way to achieve this requirement?
    You can use a SQL script for this. Just call a SQL procedure with a single date parameter and then write logic like below:
    CREATE PROC PurgeTableData
    @CutOffDate datetime
    AS
    DELETE FROM Table1 WHERE DateField < @CutOffDate;
    DELETE FROM Table2 WHERE DateField < @CutOffDate;
    DELETE FROM Table3 WHERE DateField < @CutOffDate;
    GO
    @CutOffDate denotes the date from which older data has to be purged.
    You can then schedule this SP in a sql agent job to get executed based on your required frequency
    Visakh
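    The step-1 idea above, the Script Task deriving a table definition from a CSV header, can be sketched like this (column types are an assumption here; everything lands as NVARCHAR and would be cast later, and the table/column names are made up):

```python
# Build an idempotent CREATE TABLE statement from a CSV header row,
# the kind of SQL a Script Task would hand to an Execute SQL Task.
def create_table_sql(table, header_line, delimiter=","):
    cols = [c.strip().replace(" ", "_") for c in header_line.split(delimiter)]
    col_defs = ",\n  ".join(f"[{c}] NVARCHAR(255)" for c in cols)
    return (f"IF OBJECT_ID(N'{table}') IS NULL\n"
            f"CREATE TABLE [{table}] (\n  {col_defs}\n);")

print(create_table_sql("SalesFeed", "Order ID,Customer,Amount"))
```

    Generating the statement with an OBJECT_ID guard is what makes rerunning the loop over all 50 files safe: existing tables are simply skipped.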

  • Can Photoshop Elements version 13 handle raw files from Canon camera 7D mark 2? Where can I read about it on Adobe web site?

    Can Photoshop Elements version 13 handle raw files from Canon camera 7D mark 2?
    Where can I read about it on Adobe web site?
    I found an Adobe web site note that raw plug-in version 8.7 will support Canon 7D mark 2, but is not included in Elements version 11.

    Camera Raw plug-in | Supported cameras
    Camera Raw-compatible Adobe applications

  • Lightroom 4 does not handle RAW files from Canon 5D Mk III

    I have purchased Lightroom 4. It does not handle RAW files from the Canon 5D Mk III.
    I have downloaded and ran the suggested file from Adobe, but still no luck.
    Anyone have the answer?

    The initial release of Lightroom 4 doesn't handle the 5d mk 3 files because of when the camera was released.  You can either wait for the 4.1 release, or you can download the 4.1 Release Candidate 2 at http://labs.adobe.com/technologies/lightroom4-1/.
    Technically you can also download Adobe's DNG converter, but that just adds another program to the situation.

  • How do I get CS5 Bridge/Photoshp to handle RAW files from my D800E?

    How do I get CS5 Bridge/Photoshop to handle RAW files from my D800E?

    There are many who would say, based only on empiric data (boo) and not opinion (yay), that the most accurate raw conversions would be obtained using the Nikon software and not secondary processors like the Adobe converter. The DNG process makes irreversible changes to the raw data which may or may not matter but the DNG is not the same as the original raw file.
    You can do the experiment yourself, for FREE, and compare different methods of raw conversion. You may be surprised at what you see, because different converters are tuned to different image characteristics. Don't be taken in by claims Adobe makes for its converter without proving them to yourself. Adobe has made deliberate pre-set choices for its converter and they may not be the best for you. Or they may be, as we have all become so used to what Adobe chooses that we think we chose it ourselves.
    If I were able to afford, and had the energy to lug around a D800e, I might not want to compromise the image quality I just shelled out such massive bucks to obtain using the Adobe converter without proving that was the best option for me, but I don't do it for a living. Also, if you can shell out for a D800e and are enamored of the Adobe Converter realize it only costs $10/month for a CC subscription, possibly tax deductible, and you can still retain your older version of PS. The CC version of PS allows use of the ACR as a filter layer, if you are fond of the converter tools (which are much improved in the latest version).

  • How to create .csv file from ABAP report

    Hi
    We have a requirement to generate a .csv file from an ABAP report.
    Currently the user saves data from the ABAP report to a spreadsheet (.xls format) on the desktop, then opens the Excel file and saves it as .csv format. We need an option to save directly in .csv format instead of .xls format.
    Please let me know if there is any standard function module available to create a .csv file.
    Regards
    Uma

    I tried your code and it dumps:
    REPORT ztemp101 MESSAGE-ID 00.
    TABLES: lfa1.
    TYPES: BEGIN OF t_lfa1,
             lifnr LIKE lfa1-lifnr,
             name1 LIKE lfa1-name1,
           END OF t_lfa1.
    DATA: i_lfa1 TYPE STANDARD TABLE OF t_lfa1,
          wa_lfa1 TYPE t_lfa1.
    TYPES truxs_t_text_data(4096) TYPE c OCCURS 0.
    DATA: csv_converted_table TYPE truxs_t_text_data.
    SELECT-OPTIONS: s_lifnr FOR lfa1-lifnr.
    SELECT lifnr name1 FROM lfa1 INTO TABLE i_lfa1
           WHERE lifnr IN s_lifnr.
    CALL FUNCTION 'SAP_CONVERT_TO_CSV_FORMAT'
      EXPORTING
        i_field_seperator    = ';'
      TABLES
        i_tab_sap_data       = i_lfa1
      CHANGING
        i_tab_converted_data = csv_converted_table
      EXCEPTIONS
        conversion_failed    = 1
        OTHERS               = 2.
    IF sy-subrc <> 0.
      MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
              WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
    ENDIF.
    CALL FUNCTION 'WS_DOWNLOAD'
      EXPORTING
        filename                = 'C:\Documents and Settings\ps12\Desktop\Test folder\exl.csv'
        filetype                = 'DAT'
      TABLES
        data_tab                = csv_converted_table
      EXCEPTIONS
        file_open_error         = 1
        file_write_error        = 2
        invalid_filesize        = 3
        invalid_type            = 4
        no_batch                = 5
        unknown_error           = 6
        invalid_table_width     = 7
        gui_refuse_filetransfer = 8
        customer_error          = 9
        OTHERS                  = 10.
    IF sy-subrc <> 0.
      MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
              WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
    ENDIF.
    My version is 4.6C.

  • Can we send .csv file from sap srm system to sap pi?

    Hi Experts,
    We have 3 options to send data from SAP systems to SAP PI: proxy, IDoc and RFC only.
    How can we send a .csv file from SAP SRM to SAP PI?
    Regards,
    Anjan

    Anjan
    As you know SAP SRM and SAP PI are different boxes.
    Option 1:
    we need a shared AL11 directory in between SAP SRM and SAP PI (Ask basis to setup shared folder). Place / Populate the file in the folder from SAP SRM and then it can be picked through sender file communication channel.
    In this case you (Basis team) will share one folder which is visible from the AL11 transaction of both the systems (SRM and PI). You will drop .csv file using some report or program from SRM at this location and from PI you can read that file using File communication channel (NFS mode).
    Option 2:
    Set up an FTP server in the SRM environment and expose some folder which can be accessed from PI. Use a sender file communication channel at the PI end to pick up the file.
    You can use this option in case sharing a folder is not possible (due to network or other constraints). Here an FTP server is required to expose a folder as FTP so that it is accessible from a remote location. You need to expose some folder on the SRM machine, and you will drop the .csv file at that location using some report or program from SRM. PI can then fetch the file from that location using a sender file communication channel (FTP mode), providing user credentials.
    Hope it clears now.
    Regards
    Raj

  • Copy CSV file from windows to linux

    Our Oracle is installed on a UNIX machine, so we have to place the CSV file there.
    We are on Windows, using the PuTTY client to access the UNIX server.
    How can we copy a CSV file from Windows to the UNIX server?
    Thanks

    vis1985 wrote:
    Our oracle is installed on UNIX machine so we have to place CSV file there.
    We are on windows. We are using Putty client to access UNIX server.
    how to copy CSV file from windows to UNIX server?
    Using scp (secure copy).
    It is supported by Putty and ssh services are available by default on most (if not all) modern Unix servers.
    Don't have a Windows machine close by, but I recall the scp command is supported by the Putty executable called "pscp.exe".
    The syntax is very simple:
    scp <from_destination> <to_destination>
    example (pushing a file from Windows to the Oracle Unix server):
      scp c:\datafiles\accounts.csv [email protected]:/u01/files/accounts.csv
    This will prompt for a password.
    You also can use ssh trusted certificates (RSA or DSA keys) to automate this. Using Putty you can generate a key pair that consists of a private key and public key.
    You copy and paste the public key into the destination server's $HOME/.ssh/authorized_keys file. This results in the server trusting that client, and allows the client to connect (using ssh, scp and sftp) without having to supply a password, as the private part of the key serves as authentication. (I would however not use the oracle account for this, but a special scp-copy Unix account that does not support shell usage.)
    If you want to automate it the other way around - have the Unix server pull the file from the Windows client, then you need to install server software on that Windows machine to allow it to provide such a file copy service.
    Again I suggest using ssh - OpenSSH is available for a variety of platforms, and Windows should be among them. If not, there should be an ssh service for Windows from Microsoft or another vendor that can be used.
    Any method other than ssh will be either insecure or problematic. FTP is insecure, as usernames and passwords are transmitted in clear text. Using Windows file sharing requires Samba to be loaded and configured on the Unix server - extra work installing, configuring and testing software that uses low-level security, is proprietary, and could require settings on the Windows client as well to make it work in a somewhat robust fashion.
    The standard today is ssh. It is robust and secure, it makes a lot of sense to use it, and there is little motivation for using anything else.
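
    If you later want to script the push from the Windows side, a small Python wrapper around PuTTY's pscp can do it. This is just a sketch; the file paths, user name and host name below are hypothetical examples, and it assumes pscp.exe is on the PATH:

    ```python
    import subprocess

    def build_pscp_cmd(local_file, user, host, remote_path):
        """Assemble the pscp command line for a Windows -> Unix copy."""
        return ["pscp", local_file, f"{user}@{host}:{remote_path}"]

    def push_file(local_file, user, host, remote_path):
        """Run pscp; with key-based auth set up, there is no password prompt."""
        cmd = build_pscp_cmd(local_file, user, host, remote_path)
        subprocess.run(cmd, check=True)

    # Example (hypothetical names):
    # push_file(r"c:\datafiles\accounts.csv", "oracle", "myserver",
    #           "/u01/files/accounts.csv")
    ```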

  • Inserting Record into CSV file from BizTalk Orchestration

    Scenario:
    1. Receive a file from the source system via a receive pipeline.
    2. In the Orchestration, extract some values like ENO, Ename, Salary etc. These values are to be added to a CSV file from an Expression shape. How do I append emp records to the CSV without overwriting the existing rows?
    Ex: If we submit 10 files, then the CSV file should contain 10 rows.
    Let me know how to create a CSV file from an Orchestration and how to add rows to that CSV file.
    Regards BizTalkWorship

    Simple.
    Receive the message through a Receive Port/Location.
    Create a flat-file schema representing the CSV file structure. Ensure each row is delimited by “{CR}{LF}”. 
    This flat-file schema should contain only the elements which you want to see in the destination CSV file, like ENO, Ename, Salary etc.
    Have a map where the source schema is the one representing the received file and the destination schema is the flat-file schema created above.
    Map the source schema to the destination schema, mapping the fields ENO, Ename, Salary etc.
    Have a custom send pipeline with a flat-file assembler component in it. Use this send pipeline in the send port.
    In the send port, configure a filter like “BTS.ReceivePortName == YourReceivePortName”. Configure the send port’s “Outbound Maps” to the map created in the above step.
    Key point: in your send port, set the “Copy Mode” property to “Append” instead of the default “Create New”.
    With the send port’s “Copy Mode” property configured to “Append”, the output will be appended to the existing file. Since each record in your flat-file schema is delimited by “{CR}{LF}”, and since you are appending to rather than overwriting the output file, you will end up with one file with the records appended. So if 10 files are received, instead of 10 output files you will have 1 CSV file with 10 rows.
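
    The effect of the “Append” copy mode can be illustrated outside BizTalk with a few lines of Python: opening the output file in append mode adds a row on each run instead of recreating the file. The field names ENO, Ename, Salary come from the question; the file name is just an example:

    ```python
    import csv

    def append_record(path, record):
        """Append one employee record (ENO, Ename, Salary) as a CSV row,
        creating the file on the first call if it does not exist."""
        with open(path, "a", newline="") as f:
            csv.writer(f).writerow(record)

    def count_rows(path):
        """Count the data rows currently in the CSV file."""
        with open(path, newline="") as f:
            return sum(1 for _ in csv.reader(f))
    ```

    Calling append_record once per received file leaves a single CSV with one row per file - ten submissions yield ten rows, which is exactly what the “Append” copy mode achieves in the send port.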
    If you want to construct the message in the Orchestration itself, as opposed to applying the map in the send port’s outbound maps, you can still do so.
    If this answers your question please mark it accordingly. If this post is helpful, please vote as helpful by clicking the upward arrow mark next to my reply.
