Splitting csv file

Hi,
I have a procedure which stores my retrieved records into a CSV file using the MIME type. I want to split the resulting file into two files depending on the records retrieved. Is this possible? Please reply ASAP.
Thanks in advance
Bharat
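
For illustration only, a minimal Python sketch of the general idea: routing each exported record to one of two CSV files based on a per-record condition. The file names, the key column, and the routing rule are hypothetical, since the original procedure and criteria aren't shown.

    import csv

    with open("records.csv", newline="") as src, \
         open("matched.csv", "w", newline="") as out_a, \
         open("others.csv", "w", newline="") as out_b:
        writer_a, writer_b = csv.writer(out_a), csv.writer(out_b)
        for record in csv.reader(src):
            # Hypothetical rule: route by the value in the first column.
            if record and record[0] == "A":
                writer_a.writerow(record)
            else:
                writer_b.writerow(record)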

Similar Messages

  • Splitting CSV file; output a simple tokenised string of doubles - HELP

    Okay, here it goes. I have a file that contains comma-delimited double values in the form:
    2,3
    2.8,9.0
    4.8,9.0
    I am trying to get the values into a simple string of "2 3 2.8 9.0 4.8 9.0". I have tried to replace all commas with a space and then split on whitespace, with no success.
    The other way is to split on whitespace with string.split("\\s+"), create a String[] array, and then loop through each element of the array, building a multi-dimensional array of separated values which I could then concatenate to form a string.
    My question is: where is my code going wrong when splitting the string?
    package LineGraph;

    import java.util.ArrayList;
    import java.util.List;

    public class ReadData extends Thread {

         private Cdf cdf = null;
         private String fileName = null;
         double[][] data = creatData();

         static double[][] creatData() {
              // Raw text from the text area, e.g. "2,3\n2.8,9.0\n4.8,9.0"
              String rawData = GUI.jta.getText();
              System.out.println("rawData: " + rawData);

              // Split on both delimiters at once: the comma and any whitespace (including line breaks)
              String[] point = rawData.split("[,\\s]+");

              // Keep only the non-empty tokens
              List<String> list = new ArrayList<String>();
              for (int i = 0; i < point.length; i++) {
                   System.out.println("point: " + point[i]);
                   if (point[i].trim().length() > 0) {
                        list.add(point[i].trim());
                   }
              }
              System.out.println("list: " + list);

              // Pair the values up into an n x 2 array
              double[][] data = new double[list.size() / 2][2];
              int index = -2;
              for (int i = 0; i < data.length; i++) {
                   index += 2;
                   data[i][0] = Double.parseDouble(list.get(index));
                   data[i][1] = Double.parseDouble(list.get(index + 1));
              }
              System.out.println("data.length: " + data.length); // see how much data is being created
              return data;
              // cdf = new Cdf(data, fileName, PrintMessage);
         }

         public Cdf getCdf() {
              return this.cdf;
         }
    }

    the point being there are two delimiters to split on: the first is the comma, the second is the line break, thus creating an array of single values
    I have solved the issue by using replaceAll to replace all commas with spaces and then splitting on the space.
    Still makes no sense. A line break doesn't contain spaces, so how does replacing commas with spaces allow you to split on the line break?
    If you are appending each line to a string (to build one long string), then append the data with a comma, not a space.
    I really get the idea you guys are not enjoying my code?
    It's your requirements we don't understand. You obviously aren't explaining them correctly.

  • Split records into Multiple csv files using a Threshold percentage

    Hi Gurus,
    I have a requirement to split the data from a table into two CSV files using a threshold value (in percentage).
    Assume that my source select query of the interface fetches 2000 records and I provide a threshold value like 20%.
    I need to generate csv1 with 400 records (20% of 2000) and put the rest of the records into another file, csv2.
    For implementing this I am trying to use the following process.
    1) Create a procedure with the select query to get the count of records.
    Total Records count: select count(1) from source_table <Joins> <Lookups> <Conditions>;
    2) Calculate the record count for the first CSV using the threshold_value.
    CSV1_Count = Total records count * threshold_value / 100 (e.g. 2000 * 20 / 100 = 400)
    3) Create a view that fetches the CSV1_Count(400) records for CSV1 as follows.
    Create view CSV1_view as select Col1,Col2,Col3 from source_table <Joins> <Lookups> <Conditions>
    Where rownum<=CSV1_Count;
    4) Generate CSV1 file using View 'CSV1_View'
    5) Generate CSV2 File using the Interface with same select statement (with columns ) to generate a CSV.
    select Col1,Col2,Col3 from source_table ST <Joins> <Lookups> <Conditions>
    Left outer join (Select Col1 from CSV1_View ) CS on CS.Col1=ST.Col1 where CS.Col1 is null;
    Which gives the Total records minus the CS1_View records.
    The above process seems a bit complex for something very simple. Also, if anything changes in my interface I need to change the procedure (which counts the number of records) as well.
    Please provide your comments and feedback about this; I am looking for your inputs on any new, simpler approach or on fine-tuning the above approach.
    Thanks,
    Arjun

    Arjun,
    These are my thoughts; let's do it in 3 steps.
    Step 1.  ODI Procedure
    Drop table Temp_20 ;
    Create table Temp_20 as select * from table where rownum < ( SELECT TRUNC( COUNT(1) / 5 ) FROM table );
    [** This way I am fetching approximately 20% of the table data and loading it into a temp table. 1/5th is 20%, so I divide the count by 5.
    I don't believe a view will help you, especially with ROWNUM: if you run the same query with rownum < N, the row order might differ, so a temp table is better.]
    Step 2. Use OdiSqlUnload with select columns from Temp_20.
    Step 3. Use OdiSqlUnload again with select columns from table where ( uk keys ) not in ( select uk_keys from Temp_20 ).
    [** This way you can pick the remaining 80%, and the data will not repeat itself across the 20% and 80%, as might happen with a view.]
    what do you think ?
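    For illustration only, here is a minimal Python sketch of the same 20%/80% split applied to a flat CSV export rather than to the table itself; the file names and the threshold are hypothetical.
    import csv

    threshold = 0.20  # 20% of the data rows go to the first file

    with open("source_export.csv", newline="") as src:
        rows = list(csv.reader(src))

    header, body = rows[0], rows[1:]
    cut = int(len(body) * threshold)  # e.g. 2000 rows -> 400

    with open("csv1.csv", "w", newline="") as f1:
        writer = csv.writer(f1)
        writer.writerow(header)
        writer.writerows(body[:cut])

    with open("csv2.csv", "w", newline="") as f2:
        writer = csv.writer(f2)
        writer.writerow(header)
        writer.writerows(body[cut:])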

  • How to avoid the split problem when uploading the data from csv file

    Dear Friends,
    I have to upload data from a .csv file to my custom table, and I have found a problem when uploading the data.
    I am using the code below; please suggest what I should do in this regard.
          SPLIT wa_raw_csv  AT ',' INTO
                    wa_empdata_csv-status
                     wa_empdata_csv-userid
                     wa_empdata_csv-username
                     wa_empdata_csv-Title
                     wa_empdata_csv-department.
    APPEND wa_empdata_csv TO  itab.
    In the flat file, one of the records has the field Title as
    "Director, Finance - NAR". Through my code, wa_empdata_csv-Title gets the split value "Director" and the Department field gets "Finance - NAR"; even though "Director, Finance - NAR" is one value, it is being split.
    That is the problem I am facing. Please could anybody let me know how I should handle this in my code so that
    "Director, Finance - NAR" will not be split into two words.
    Thanks & Regards
    Madhuri

    Hi Madhuri,
    The best way to avoid such a problem is to use a TAB-delimited file instead of comma-separated data. Generally TAB does not appear in data.
    If you are generating the file, use tab instead of comma.
    If you cannot modify the format of the file and the data length in the file is fixed-character, you will need to define the structure and then move the data into the fixed-length structure.
    Regards,
    Mohaiyuddin
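    If the file format cannot be changed and the comma-containing value is quoted in the source (e.g. "Director, Finance - NAR"), another option is to parse with a CSV parser that honors quoting rather than splitting blindly on commas. A minimal Python sketch of that idea, with made-up columns:
    import csv

    line = '1001,U123,Jane Doe,"Director, Finance - NAR",Finance'

    # csv.reader keeps the quoted field together instead of splitting on the
    # embedded comma; a plain line.split(",") would break it into two fields.
    fields = next(csv.reader([line]))
    print(fields)  # ['1001', 'U123', 'Jane Doe', 'Director, Finance - NAR', 'Finance']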

  • How To Split Large Excel or CSV Files into Smaller Files

    Does anyone know how to split a large Excel or CSV file into multiple smaller files?  Or, is there an app that will work with Mac to do that?

    split [-a suffix_length] [-b byte_count[k|m]] [-l line_count] [-p pattern] [file [name]]
    is a native Terminal command. Read up more at https://developer.apple.com/library/mac/documentation/Darwin/Reference/ManPages/man1/split.1.html
    I prefer to use gsplit, which is the GNU split from coreutils.
    You can install GNU coreutils using Homebrew.
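    For a cross-platform alternative, here is a minimal Python sketch that splits a large CSV into chunks of a fixed number of rows while repeating the header in each part; the file name and chunk size are hypothetical.
    import csv

    chunk_rows = 100000           # rows per output file (hypothetical)

    def write_part(part, header, rows):
        with open("part_%d.csv" % part, "w", newline="") as out:
            writer = csv.writer(out)
            writer.writerow(header)
            writer.writerows(rows)

    with open("large_export.csv", newline="") as src:
        reader = csv.reader(src)
        header = next(reader)
        part, rows = 1, []
        for row in reader:
            rows.append(row)
            if len(rows) == chunk_rows:
                write_part(part, header, rows)
                part, rows = part + 1, []
        if rows:                  # write the remainder
            write_part(part, header, rows)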

  • Split a CSV file into separate lines

    Hi,
    does anyone know how to split the contents of a .CSV file into separate files, one for each line?
    Sample "file.csv" contains:
    file.csv contains:
    file1,22,cat,33,cmyk,new,
    file2,22,dog,45,spot,old,
    file3,22,mouse,50,cmyk,new,
    need this output:
    cat.csv
    containing:
    file1,22,cat,33,cmyk,new,
    dog.csv
    containing:
    file2,22,dog,45,spot,old,
    mouse.csv
    containing:
    file3,22,mouse,50,cmyk,new,
    Naming the files using a certain field would be nice but not critical; I would be happy to just have "file_1.csv, file_2.csv".

    Use:
    while read -r line
    do
        # Field 3 (e.g. "cat") becomes the output file name.
        filename=$(echo "$line" | cut -f3 -d,).csv
        echo "$line" > ~/Desktop/"$filename"
    done < "$1"

  • How do I split a Large CSV file into Multiple CSV's Using Powershell

    I am a novice at PowerShell but this looks to be the best tool for this task. I have a CSV file that looks like this:
    Date,Policy,Application
    10/13/2014,No,None
    10/13/2014,No,None
    10/13/2014,No,None
    10/13/2014,No,None
    10/13/2014,No,None
    11/14/2013,Yes,AppBiz
    11/14/2013,Yes,AppBiz
    11/14/2013,Yes,AppBiz
    07/04/2013,No,PeopleBiz
    07/04/2013,No,PeopleBiz
    07/04/2013,No,PeopleBiz
    07/04/2013,No,PeopleBiz
    Is it possible to split this CSV into multiple CSVs based on "Application"?
    Let's say the output might look like:
    None.csv
    10/13/2014,No,None
    10/13/2014,No,None
    10/13/2014,No,None
    10/13/2014,No,None
    10/13/2014,No,None
    AppBiz.csv
    11/14/2013,Yes,AppBiz
    11/14/2013,Yes,AppBiz
    11/14/2013,Yes,AppBiz
    PeopleBiz.csv
    07/04/2013,No,PeopleBiz
    07/04/2013,No,PeopleBiz
    07/04/2013,No,PeopleBiz
    07/04/2013,No,PeopleBiz
    Any help would be greatly appreciated

    I think this might be what you want:
    Import-Csv applications.csv |
        Group-Object Application |
        ForEach-Object {
            # One output file per distinct Application value, named after it.
            $_.Group | Export-Csv "$($_.Name).csv" -NoTypeInformation
        }
    Very nice! 4x faster..
    I doubt the OP will get what you just did there..
    Sam Boutros, Senior Consultant, Software Logic, KOP, PA http://superwidgets.wordpress.com

  • Splitting of a CSV File with Multiple Records into Multiple XML File

    Dear All,
    <b> I am doing a Scenario of CSV to XML Files. I am using BPM for the same. My incoming CSV File has got multiple records. I want to break this Multiple records into Multiple XML Files having one record each.</b>
    Can someone suggest how can I break this rather Split this into Multiple XML Files.
    Is Multimapping absoltely necesaary for this. Can't we do this without Multimapping. Can we have some workaround in the FCC parameters that we use in the Integration Directory.
    Kindly reply ASAP. Thanks a lot to all in anticipation.
    Pls Help.
    Best Regards
    Chakra and Somnath

    Dear All,
    I am trying to do the multi-mapping, and have 0..unbounded also. Somehow it is not working.
    Smitha, please tell me one thing: does assigning Recordsets per Message to 1 mean that it will write multiple XML files, as I want?
    Also, I am using Set to Read Only, so once the file is read it becomes RA from A. Then will it write the other records?
    I have to use a BPM because there are certain dependencies in the entire process flow. I cannot do without a BPM.
    Awaiting a reply. Thanks a lot in anticipation.
    Best Regards
    Chakra and Somnath

  • Read a CSV file and dynamically generate the insert

    I have a requirement where multiple CSVs need to be exported to a SQL table. So far, I am able to read the CSV file and generate the insert statement dynamically for selected columns; however, when the insert statement is passed as a parameter to $cmd.CommandText it does not evaluate the values.
    How do I evaluate the string in PowerShell?
    Import-Csv -Path $FileName.FullName | % {
        # Insert statement.
        $insert = "INSERT INTO $Tablename ($ReqColumns) Values ('"
        $valCols = ''
        $DataCols = ''
        $lists = $ReqColumns.split(",")
        foreach ($l in $lists) {
            $valCols = $valCols + '$($_.' + $l + ')'','''
        }
        # Generate the values statement
        $DataCols = ($DataCols + $valCols + ')').replace(",')", "")
        $insertStr = @("INSERT INTO $Tablename ($ReqColumns) Values ('$($DataCols))")
        # The above statement generates the following insert statement:
        # INSERT INTO TMP_APPLE_EXPORT (PRODUCT_ID,QTY_SOLD,QTY_AVAILABLE) Values (' $($_.PRODUCT_ID)','$($_.QTY_SOLD)','$($_.QTY_AVAILABLE)' )
        $cmd.CommandText = $insertStr # does not evaluate the values
        # If the same statement is passed as below then it executes successfully
        # $cmd.CommandText = "INSERT INTO TMP_APL_EXPORT (PRODUCT_ID,QTY_SOLD,QTY_AVAILABLE) Values (' $($_.PRODUCT_ID)','$($_.QTY_SOLD)','$($_.QTY_AVAILABLE)' )"
        # Execute Query
        $cmd.ExecuteNonQuery() | Out-Null
    }
    jyeragi

    Hi Jyeragi,
    To convert the data to the SQL table format, please try this function out-sql:
    out-sql Powershell function - export pipeline contents to a new SQL Server table
    If I have any misunderstanding, please let me know.
    If you have any feedback on our support, please click here.
    Best Regards,
    Anna
    TechNet Community Support
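    A side note on the root cause: the literal text '$($_.PRODUCT_ID)' is built inside single quotes, so it is never expanded when the string is later assigned to $cmd.CommandText; the row values have to be substituted per row before executing, or better, bound as parameters. For illustration only, a minimal Python sketch of the same load using parameterized inserts (the file name and the sqlite3 stand-in database are hypothetical, and the sketch assumes the CSV has a header row; the table and columns are taken from the post above):
    import csv
    import sqlite3  # stand-in database; the idea is the same for any driver with parameter binding

    columns = ["PRODUCT_ID", "QTY_SOLD", "QTY_AVAILABLE"]
    conn = sqlite3.connect("export.db")
    conn.execute("CREATE TABLE IF NOT EXISTS TMP_APPLE_EXPORT (PRODUCT_ID, QTY_SOLD, QTY_AVAILABLE)")

    placeholders = ",".join("?" for _ in columns)
    sql = "INSERT INTO TMP_APPLE_EXPORT (%s) VALUES (%s)" % (",".join(columns), placeholders)

    with open("apple_export.csv", newline="") as f:
        for row in csv.DictReader(f):
            # Values are bound per row, so nothing depends on string interpolation.
            conn.execute(sql, [row[c] for c in columns])

    conn.commit()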

  • Error loading csv file from application server

    Hi all,
    While uploading a csv file from the application server to psa we are getting the following error,
    Error 2 while splitting CSV data record
    Message no. RSDS_ACCESS011
    Diagnosis
    Error 2 occurred while splitting the CSV data record 1
    1 = Could not find a closing escape character
    2 = Invalid escape character
    3 = Conversion error
    4 = Other error
    System Response
    The function was terminated.
    Procedure
    Check the values of the data separator and escape sign, and try again.
    But I've checked the file, and the escape sign and data separator in it are also fine. The same file loads successfully in the quality system.
    How do I solve this error?
    Thanks in advance.

    Hi BI consultant:
       Could you please provide more details?
    For example:
    1.Is your P application server a UNIX flavor? (Solaris, AIX, UX, Linux)
       If yes..
             2. Are you able to see the contents of the file correctly with a "cat" or "vi" command? (at operating system level).
                   If no...
                         3. Did you upload the csv flat file to the server via FTP?
                                If yes...
                                     4. Did you use the "binary" or the "ascii" parameter on the FTP command used to upload the file?
    Probably you need to upload the CSV file again to your application server and make sure you can see the file contents ("cat" or "vi" command) before trying to execute the InfoPackage.
    Regards,
    Francisco Milán.
    Edited by: Francisco Milan on Jun 3, 2010 11:13 AM

  • Is there a way to open CSV files with more than 255 columns?

    I have a CSV file with more than 255 columns of data.  It's a fairly standard export of social media data that shows volume of posts by day for the past year, from which I can analyze the data and publish customized charts. Very easy in Excel but I'm hitting the Numbers limit of 255 columns per table. Is there a way to work around the limitation? Perhaps splitting the CSV in two? The data shows up in the CSV file when I open via TextEdit, so it's there. Just can't access it in Numbers. And it's not very usable/useful for me in TextEdit.
    Regards,
    Tim

    You might be better off with Excel. Even if you could find a way to easily split the CSV file into two tables, it would be two tables when you want only one.  You said you want to make charts from this data.  While a series on a chart can be constructed from data in two different tables, to do so takes a few extra steps for each series on the chart.
    For a test to see if you want to proceed, make two small tables with data spanning the tables and make a chart from that data.  Make the chart the normal way using the data in the first table then repeat the following steps for each series
    Select the series in the chart
    Go to Format sidebar
    Click in the "Value" box
    Add a comma, then select the data for this series from the second table
    Press Return
    If there is an easier way to do this, maybe someone else will chime in with that info.
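    If splitting the CSV in two is the route taken, one workable approach is to cut it by columns and repeat a key column in both halves so the rows can still be matched up. A minimal Python sketch of that idea; the file names and the split point are hypothetical.
    import csv

    max_cols = 255  # Numbers' per-table column limit mentioned above

    with open("social_media_export.csv", newline="") as src, \
         open("part_a.csv", "w", newline="") as a, \
         open("part_b.csv", "w", newline="") as b:
        writer_a, writer_b = csv.writer(a), csv.writer(b)
        for row in csv.reader(src):
            writer_a.writerow(row[:max_cols])
            # Repeat the first (key) column so the second file stays matchable.
            writer_b.writerow(row[:1] + row[max_cols:])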

  • Reading a CSV file from server

    Hi All,
    I am reading a CSV file from the server, and my internal table has only one field, with length 200. The input CSV file has more than one column, and after splitting each line my internal table should have the same number of rows as the input record has columns.
    But when I do that, the last field in the internal table is appended with #.
    Can somebody tell me the solution for this?
    U can see the my code below.
    data: begin of itab_infile occurs 0,
             input(3000),
          end of itab_infile.
    data: begin of itab_rec occurs 0,
             record(200),
          end of itab_rec.
    data: c_sep(1) value ','.
    open dataset f_name1 for input in text mode encoding default.
    if sy-subrc <> 0.
      write: /, 'FILE NOT FOUND'.
      exit.
    endif.
    do.
      read dataset f_name1 into itab_infile-input.
      if sy-subrc <> 0.
        exit.
      endif.
      split itab_infile-input at c_sep into table itab_rec.
    enddo.
    Thanks in advance.
    Sunil

    Sunil,
    You go not mention the platform on which the CSV file was created and the platform on which it is read.
    A common problem with CSV files created on MS/Windows platforms and read on unix is the end-of-record (EOR) characters.
    MS/Windows usings <CR><LF> as the EOR
    Unix using either <CR> or <LF>
    If on unix open the file using vi in a telnet session to confirm the EOR type.
    The fix options.
    1) Before opening the opening the file in your ABAP program run the unix command dos2unix.
    2) Transfer the file from the MS/Windows platform to unix using FTP using ascii not bin.  This does the dos2unix conversion on the fly.
    3) Install SAMBA and share the load directory to the windows platforms.  SAMBA also handles the dos2unix and unix2dos conversions on the fly.
    Hope this helps
    David Cooper
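    For what it's worth, the stray # at the end of the last field is typically how SAP displays the unprintable carriage return that remains when a <CR><LF> file is read as <LF>-terminated. A tiny Python sketch of the same effect and of what the conversion fixes (the sample record is made up):
    line = "field1,field2,field3\r\n"       # a record as written on MS/Windows

    cols = line.rstrip("\n").split(",")
    print(cols)                             # ['field1', 'field2', 'field3\r']  <- stray CR on the last field

    cols = line.rstrip("\r\n").split(",")   # strip the CR as well (what dos2unix does for the whole file)
    print(cols)                             # ['field1', 'field2', 'field3']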

  • How to upload .CSV file from Application Server

    Hi Experts,
        How to upload .CSV file separated by ',' from Application server to an internal table.
    Invoice No,Cust No,Item Type,Invoice Date,days,Discount Amount,Gross Amount,Sales Amount,Customer Order No.,Group,Pay Terms
    546162,3233,1,9/4/2007,11,26.79,5358.75,5358.75,11264,HRS,11
    546163,2645,1,9/4/2007,11,3.07,305.25,305.25,10781,C,11
    Actually I read some already answered posts. But still I have some doubts.
    Can anybody please send me the code.
    Thanks in Advance.

    Hi Priya,
    Check this code
    The logic used here is as follows:
    Get all the data into an internal table in a simple format, i.e. one row with a single field containing an entire line.
    After getting the data, split each line of the table on every occurrence of the delimiter (comma in your case).
    Here I have named the fields field01, field02, etc.; you could use your own names according to your requirement.
    parameters: p_file(512).
      DATA : BEGIN OF ITAB OCCURS 0,
              COL1(1024) TYPE C,
             END OF ITAB,
             WA_ITAB LIKE LINE OF ITAB.
      DATA: BEGIN OF ITAB_2 OCCURS 0,
        FIELD01(256),
        FIELD02(256),
        FIELD03(256),
        FIELD04(256),
        FIELD05(256),
        FIELD06(256),
        FIELD07(256),
        FIELD08(256),
        FIELD09(256),
        FIELD10(256),
        FIELD11(256),
        FIELD12(256),
        FIELD13(256),
        FIELD14(256),
        FIELD15(256),
        FIELD16(256),
       END OF ITAB_2.
      DATA: WA_2 LIKE LINE OF ITAB_2.
        OPEN DATASET p_file FOR INPUT IN TEXT MODE ENCODING NON-UNICODE.
        IF sy-subrc = 8.
          WRITE: / 'File', p_file, 'cannot be opened'.
          EXIT.
        ENDIF.
        WHILE sy-subrc = 0.
          READ DATASET p_file INTO wa_itab.
          IF sy-subrc = 0.
            APPEND wa_itab TO itab.   " only append lines that were actually read
          ENDIF.
        ENDWHILE.
        CLOSE DATASET p_FILE.
      LOOP AT ITAB INTO WA_ITAB.
        SPLIT WA_ITAB-COL1 AT ','    " where comma is your delimiter
         INTO WA_2-FIELD01 WA_2-FIELD02 WA_2-FIELD03 WA_2-FIELD04
         WA_2-FIELD05 WA_2-FIELD06 WA_2-FIELD07 WA_2-FIELD08 WA_2-FIELD09
         WA_2-FIELD10 WA_2-FIELD11 WA_2-FIELD12 WA_2-FIELD13 WA_2-FIELD14
         WA_2-FIELD15 WA_2-FIELD16.
        APPEND WA_2 TO ITAB_2.
        CLEAR WA_2.
      ENDLOOP.
    Message was edited by:
            Kris Donald

  • How to read a CSV file into the portal

    hi all,
    I want to read the content of a CSV file that is available on my desktop.
    Please help me with suitable code.
    Regards
    Savitha
    Edited by: Savitha S R on Jun 1, 2009 8:25 AM

    Please use this code for that
    REPORT  znkp_upload_csv line-size 400.
    DATA : v_filename TYPE string.
    PARAMETER : p_file LIKE rlgrap-filename DEFAULT 'C:\Documents and Settings\Administrator\Desktop\JV.csv' .
    *Types for Reading CSV file
    TYPES : BEGIN OF x_csv,
              text(400) TYPE c,
            END OF x_csv.
    *Global internal table for reading CSV file
    DATA : lt_csv TYPE TABLE OF x_csv.
    *Global work area for reading CSV file
    DATA : wa_csv LIKE LINE OF lt_csv.
    v_filename = p_file.
    CALL FUNCTION 'GUI_UPLOAD'
      EXPORTING
        filename                = v_filename
        filetype                = 'ASC'
        has_field_separator     = ' '
        header_length           = 0
        read_by_line            = 'X'
        dat_mode                = ' '
        codepage                = ' '
        ignore_cerr             = abap_true
        replacement             = '#'
        check_bom               = ' '
        no_auth_check           = ' '
      TABLES
        data_tab                = lt_csv
      EXCEPTIONS
        file_open_error         = 1
        file_read_error         = 2
        no_batch                = 3
        gui_refuse_filetransfer = 4
        invalid_type            = 5
        no_authority            = 6
        unknown_error           = 7
        bad_data_format         = 8
        header_not_allowed      = 9
        separator_not_allowed   = 10
        header_too_long         = 11
        unknown_dp_error        = 12
        access_denied           = 13
        dp_out_of_memory        = 14
        disk_full               = 15
        dp_timeout              = 16
        OTHERS                  = 17.
    IF sy-subrc <> 0.
      MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
              WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
    ENDIF.
    DATA : BEGIN OF item OCCURS 0 ,
            t1(20) TYPE c,
            t2(20) TYPE c,
            t3(20) TYPE c,
            t4(20) TYPE c,
            t5(20) TYPE c,
            t6(20) TYPE c,
            t7(20) TYPE c,
            t8(20) TYPE c,
           END OF item.
    DATA : txt(1) TYPE c. " 1-Header 2-Item
    LOOP AT lt_csv into wa_csv.
    split wa_csv-text at ',' into item-t1
                                  item-t2
                                  item-t3
                                  item-t4
                                  item-t5
                                  item-t6
                                  item-t7
                                  item-t8.
    append item.
    clear item.
    ENDLOOP.
    Check the ITEM table.
    Regards
    Naresh

  • Loading a CSV file and accessing the variables

    Hi guys,
    I'm new to AS3 and dealt with AS2 before (I was just getting the grasp of it when they changed it).
    Is it possible in AS3 to load an Excel .csv file into Flash using URLLoader (or something else) and use the data as variables?
    I can get the .csv to load and trace the values (cell1,cell2,cell3....) but I'm not sure how to collect the data and place it into variables.
    Can I just create an array and access it like so: myArray[0], myArray[1]? If so, I'm not sure why it's not working.
    I must be on the completely wrong path. Here's what I have so far....
    var loader:URLLoader = new URLLoader();
    loader.dataFormat = URLLoaderDataFormat.VARIABLES;
    loader.addEventListener(Event.COMPLETE, dataLoaded);
    var request:URLRequest = new URLRequest("population.csv");
    loader.load(request);
    function dataLoaded(evt:Event):void {
        var myData:Array = new Array(loader.data);
        trace(myData[i]);
    Thanks for any help,
    Sky

    just load your csv file and use the Flash string methods to allocate those values to an array:
    var myData:Array = loader.data.split(",");
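    Note that a single split(",") flattens everything onto commas only; if row boundaries matter, split on line breaks first and then split each row on commas (two nested split() calls). The same idea as a short Python sketch with made-up data:
    csv_text = "10,200,3000\n11,201,3001\n12,202,3002"

    # One list per line, one element per cell in that line.
    rows = [line.split(",") for line in csv_text.splitlines()]
    print(rows[0][1])   # "200"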
