Issue in transferring records to a file using TRANSFER (in binary mode)

Hi all,
I am using a TRANSFER statement to write the contents of an internal table to a file on the application server.
My code is:
   LOOP AT i_file.
     TRANSFER i_file-rec_data TO i_pf_names-v_name.
   ENDLOOP.
I am opening the file in binary mode. It is not writing the second record completely. Each of my records is 432 characters long. Irrespective of the number of records in my internal table, it writes only one record completely and half of the second one.
I am aware that opening a file in binary mode ignores line breaks.
So opening in binary mode should write 1000 characters, but it writes 511 characters and stops. It does not write any further records even though my internal table contains them.
Can you please help?

Hi,
If you are viewing the application server file in AL11, you won't be able to see the full length of the records even though they have been written. To confirm this, use transaction CG3Y to download the application server file to the presentation server and check whether the whole record has been written.
Regards,
Vikranth
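For reference, here is a minimal sketch of how the write could look end to end. The field names i_file-rec_data and i_pf_names-v_name are taken from the post above; the explicit OPEN/CLOSE statements, the helper variable v_file and the LENGTH addition are assumptions for illustration, not the poster's actual program.
   " Minimal sketch (assumed context): write 432-character records
   " to an application server file in binary mode.
   DATA: v_file TYPE string.

   v_file = i_pf_names-v_name.        " target file on the application server

   OPEN DATASET v_file FOR OUTPUT IN BINARY MODE.
   IF sy-subrc <> 0.
     MESSAGE 'Could not open file' TYPE 'E'.
   ENDIF.

   LOOP AT i_file.
     " LENGTH states the number of characters to transfer explicitly
     TRANSFER i_file-rec_data TO v_file LENGTH 432.
   ENDLOOP.

   CLOSE DATASET v_file.
After downloading the result with CG3Y as suggested above, the file size should correspond to 432 characters per record.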

Similar Messages

  • How to record .jar file using Java Protocol

    Hi,
    I tried to record a .jar application using the Java protocol with the HP LoadRunner 11.0 tool.
    I am unable to record the application.
    Can anyone suggest the process for recording a .jar file using LoadRunner?
    Thanks,
    Venkat

    Look at the manual page for jar:
    # man jar
    Also you can run them by doing:
    # java -jar Prog.jar

  • Read an avi file using "Read from binary file" vi

    My question is how to read an AVI file using the "Read from binary file" VI.
    My objective is to create a series of small AVI files using IMAQ AVI Write Frame with the MPEG-4 codec, each 2 seconds long (so 40 frames per file at 20 fps), and then send them one by one so as to create a stream of video. The images are grabbed from a USB camera. If I read those frames using IMAQ AVI Read Frame, the compression advantage would be lost, so I want to read the whole file itself.
    I read the AVI file using "Read from binary file" with unsigned 8-bit data format, sent it to the remote end, saved it and displayed it, but it didn't work. I later found that if I read an image file using "Read from binary file" with unsigned 8-bit data format and save it on the local computer itself, the format is changed and the file becomes unrecognizable. Am I doing something wrong by reading the file in unsigned 8-bit integer format, or should I have used another data type?
    I am using LabVIEW 8.5 and the LabVIEW Vision Development Module and Vision Acquisition Module 8.5.
    Your help would be highly appreciated.
    Thank you.
    Attachments:
    read avi file in other data format.JPG ‏26 KB

    Hello,
    Check out the (full) help message for "write to binary file"
    The "prepend array or string size" input defaults to true, so in your example the data written to the file will have array size information added at the beginning and your output file will be (four bytes) longer than your input file. Wire a False constant to "prepend array or string size" to prevent this happening.
    Rod.

  • Issue while loading a csv file using sql*loader...

    Hi,
    I am loading a csv file using sql*loader.
    On the number columns that have data populated in them (decimals/integers), the row errors out with:
    ORA-01722: invalid number
    I tried checking the values picked from the Excel file,
    and found chr(13), chr(32) and chr(10) characters in the value.
    For example, select length('0.21') from dual gives a value of 7.
    When I checked each character with
    select ascii(substr('0.21',5,1)) from dual, it returned a value of 9, etc.
    I tried the following command,
    "to_number(trim(replace(replace(replace(replace(:std_cost_price_scala,chr(9),''),chr(32),''),chr(13),''),chr(10),'')))",
    to remove all the non-numeric special characters, but I am still facing the error.
    Please let me know of any solution for this error.
    Thanks in advance.
    Kiran

    control file:
    OPTIONS (ROWS=1, ERRORS=10000)
    LOAD DATA
    CHARACTERSET WE8ISO8859P1
    INFILE '$Xx_TOP/bin/ITEMS.csv'
    APPEND INTO TABLE XXINF.ITEMS_STAGE
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"' TRAILING NULLCOLS
    ItemNum                    "trim(replace(replace(:ItemNum,chr(9),''),chr(13),''))",
    cross_ref_old_item_num               "trim(replace(replace(:cross_ref_old_item_num,chr(9),''),chr(13),''))",
    Mas_description               "trim(replace(replace(:Mas_description,chr(9),''),chr(13),''))",
    Mas_long_description               "trim(replace(replace(:Mas_long_description,chr(9),''),chr(13),''))",
    Org_description               "trim(replace(replace(:Org_description,chr(9),''),chr(13),''))",
    Org_long_description               "trim(replace(replace(:Org_long_description,chr(9),''),chr(13),''))",
    user_item_type                    "trim(replace(replace(:user_item_type,chr(9),''),chr(13),''))",
    organization_code               "trim(replace(replace(:organization_code,chr(9),''),chr(13),''))",
    primary_uom_code               "trim(replace(replace(:primary_uom_code,chr(9),''),chr(13),''))",
    inv_default_item_status          "trim(replace(replace(:inv_default_item_status,chr(9),''),chr(13),''))",
    inventory_item_flag               "trim(replace(replace(:inventory_item_flag,chr(9),''),chr(13),''))",
    stock_enabled_flag               "trim(replace(replace(:stock_enabled_flag,chr(9),''),chr(13),''))",
    mtl_transactions_enabled_flag          "trim(replace(replace(:mtl_transactions_enabled_flag,chr(9),''),chr(13),''))",
    revision_qty_control_code          "trim(replace(replace(:revision_qty_control_code,chr(9),''),chr(13),''))",
    reservable_type               "trim(replace(replace(:reservable_type,chr(9),''),chr(13),''))",
    check_shortages_flag               "trim(replace(replace(:check_shortages_flag,chr(9),''),chr(13),''))",
    shelf_life_code               "trim(replace(replace(replace(replace(:shelf_life_code,chr(9),''),chr(13),''),chr(32),''),chr(10),''))",
    shelf_life_days               "trim(replace(replace(replace(replace(:shelf_life_days,chr(9),''),chr(13),''),chr(32),''),chr(10),''))",
    lot_control_code               "trim(replace(replace(:lot_control_code,chr(9),''),chr(13),''))",
    auto_lot_alpha_prefix               "trim(replace(replace(:auto_lot_alpha_prefix,chr(9),''),chr(13),''))",
    start_auto_lot_number               "trim(replace(replace(:start_auto_lot_number,chr(9),''),chr(13),''))",
    negative_measurement_error          "trim(replace(replace(replace(replace(:negative_measurement_error,chr(9),''),chr(13),''),chr(32),''),chr(10),''))",
    positive_measurement_error          "trim(replace(replace(replace(replace(:positive_measurement_error,chr(9),''),chr(13),''),chr(32),''),chr(10),''))",
    serial_number_control_code          "trim(replace(replace(:serial_number_control_code,chr(9),''),chr(13),''))",
    auto_serial_alpha_prefix          "trim(replace(replace(:auto_serial_alpha_prefix,chr(9),''),chr(13),''))",
    start_auto_serial_number          "trim(replace(replace(:start_auto_serial_number,chr(9),''),chr(13),''))",
    location_control_code               "trim(replace(replace(:location_control_code,chr(9),''),chr(13),''))",
    restrict_subinventories_code          "trim(replace(replace(:restrict_subinventories_code,chr(9),''),chr(13),''))",
    restrict_locators_code               "trim(replace(replace(:restrict_locators_code,chr(9),''),chr(13),''))",
    bom_enabled_flag               "trim(replace(replace(:bom_enabled_flag,chr(9),''),chr(13),''))",
    costing_enabled_flag               "trim(replace(replace(:costing_enabled_flag,chr(9),''),chr(13),''))",
    inventory_asset_flag               "trim(replace(replace(:inventory_asset_flag,chr(9),''),chr(13),''))",
    default_include_in_rollup_flag          "trim(replace(replace(:default_include_in_rollup_flag,chr(9),''),chr(13),''))",
    cost_of_goods_sold_account          "trim(replace(replace(:cost_of_goods_sold_account,chr(9),''),chr(13),''))",
    std_lot_size                    "trim(replace(replace(replace(replace(:std_lot_size,chr(9),''),chr(13),''),chr(32),''),chr(10),''))",
    sales_account                    "trim(replace(replace(:sales_account,chr(9),''),chr(13),''))",
    purchasing_item_flag               "trim(replace(replace(:purchasing_item_flag,chr(9),''),chr(13),''))",
    purchasing_enabled_flag          "trim(replace(replace(:purchasing_enabled_flag,chr(9),''),chr(13),''))",
    must_use_approved_vendor_flag          "trim(replace(replace(:must_use_approved_vendor_flag,chr(9),''),chr(13),''))",
    allow_item_desc_update_flag          "trim(replace(replace(:allow_item_desc_update_flag,chr(9),''),chr(13),''))",
    rfq_required_flag               "trim(replace(replace(:rfq_required_flag,chr(9),''),chr(13),''))",
    buyer_name                    "trim(replace(replace(:buyer_name,chr(9),''),chr(13),''))",
    list_price_per_unit               "trim(replace(replace(replace(replace(:list_price_per_unit,chr(9),''),chr(13),''),chr(32),''),chr(10),''))",
    taxable_flag                    "trim(replace(replace(:taxable_flag,chr(9),''),chr(13),''))",
    purchasing_tax_code               "trim(replace(replace(:purchasing_tax_code,chr(9),''),chr(13),''))",
    receipt_required_flag               "trim(replace(replace(:receipt_required_flag,chr(9),''),chr(13),''))",
    inspection_required_flag          "trim(replace(replace(:inspection_required_flag,chr(9),''),chr(13),''))",
    price_tolerance_percent          "trim(replace(replace(replace(replace(:price_tolerance_percent,chr(9),''),chr(13),''),chr(32),''),chr(10),''))",
    expense_account               "trim(replace(replace(:expense_account,chr(9),''),chr(13),''))",
    allow_substitute_receipts_flag          "trim(replace(replace(:allow_substitute_receipts_flag,chr(9),''),chr(13),''))",
    allow_unordered_receipts_flag          "trim(replace(replace(:allow_unordered_receipts_flag,chr(9),''),chr(13),''))",
    receiving_routing_code               "trim(replace(replace(:receiving_routing_code,chr(9),''),chr(13),''))",
    inventory_planning_code          "trim(replace(replace(:inventory_planning_code,chr(9),''),chr(13),''))",
    min_minmax_quantity               "trim(replace(replace(replace(replace(:min_minmax_quantity,chr(9),''),chr(13),''),chr(32),''),chr(10),''))",
    max_minmax_quantity               "trim(replace(replace(replace(replace(:max_minmax_quantity,chr(9),''),chr(13),''),chr(32),''),chr(10),''))",
    planning_make_buy_code               "trim(replace(replace(:planning_make_buy_code,chr(9),''),chr(13),''))",
    source_type                    "trim(replace(replace(:source_type,chr(9),''),chr(13),''))",
    mrp_safety_stock_code               "trim(replace(replace(:mrp_safety_stock_code,chr(9),''),chr(13),''))",
    material_cost                    "trim(replace(replace(:material_cost,chr(9),''),chr(13),''))",
    mrp_planning_code               "trim(replace(replace(:mrp_planning_code,chr(9),''),chr(13),''))",
    customer_order_enabled_flag          "trim(replace(replace(:customer_order_enabled_flag,chr(9),''),chr(13),''))",
    customer_order_flag               "trim(replace(replace(:customer_order_flag,chr(9),''),chr(13),''))",
    shippable_item_flag               "trim(replace(replace(:shippable_item_flag,chr(9),''),chr(13),''))",
    internal_order_flag               "trim(replace(replace(:internal_order_flag,chr(9),''),chr(13),''))",
    internal_order_enabled_flag          "trim(replace(replace(:internal_order_enabled_flag,chr(9),''),chr(13),''))",
    invoice_enabled_flag               "trim(replace(replace(:invoice_enabled_flag,chr(9),''),chr(13),''))",
    invoiceable_item_flag               "trim(replace(replace(:invoiceable_item_flag,chr(9),''),chr(13),''))",
    cross_ref_ean_code               "trim(replace(replace(:cross_ref_ean_code,chr(9),''),chr(13),''))",
    category_set_intrastat               "trim(replace(replace(:category_set_intrastat,chr(9),''),chr(13),''))",
    CustomCode                    "trim(replace(replace(:CustomCode,chr(9),''),chr(13),''))",
    net_weight                    "trim(replace(replace(replace(replace(:net_weight,chr(9),''),chr(13),''),chr(32),''),chr(10),''))",
    production_speed               "trim(replace(replace(:production_speed,chr(9),''),chr(13),''))",
    LABEL                         "trim(replace(replace(:LABEL,chr(9),''),chr(13),''))",
    comment1_org_level               "trim(replace(replace(:comment1_org_level,chr(9),''),chr(13),''))",
    comment2_org_level               "trim(replace(replace(:comment2_org_level,chr(9),''),chr(13),''))",
    std_cost_price_scala               "to_number(trim(replace(replace(replace(replace(:std_cost_price_scala,chr(9),''),chr(32),''),chr(13),''),chr(10),'')))",
    supply_type                    "trim(replace(replace(:supply_type,chr(9),''),chr(13),''))",
    subinventory_code               "trim(replace(replace(:subinventory_code,chr(9),''),chr(13),''))",
    preprocessing_lead_time          "trim(replace(replace(replace(replace(:preprocessing_lead_time,chr(9),''),chr(32),''),chr(13),''),chr(10),''))",
    processing_lead_time                "trim(replace(replace(replace(replace(:processing_lead_time,chr(9),''),chr(32),''),chr(13),''),chr(10),''))",
    wip_supply_locator               "trim(replace(replace(:wip_supply_locator,chr(9),''),chr(13),''))"
    Sample data from csv file.
    "9901-0001-35","390000","JMKL16 Pipe bend 16 mm","","JMKL16 Putkikaari 16 mm","","AI","FJE","Ea","","","","","","","","","","","","","","","","","","","","","","","","","21-21100-22200-00000-00000-00-00000-00000","0","21-11100-22110-00000-00000-00-00000-00000","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","","0.1","Pull","AFTER PROD","","","Locator for Production"
    The load errors out especially on two columns:
    1) std_cost_price_scala
    2) list_price_per_unit
    both are number columns.
    When data is provided in them, the load errors out; but if they hold null values, the records go through fine.

  • Extract records from file using RegEx

    Hi guys,
    I was wondering if there's a way to extract records from a text file using regular expressions matching groups in Endeca Integrator.
    I have several XML files and I have mapped and extracted records from them using XML Reader component, but I need to preserve the tags of mapped elements. For example, if I have the following XML:
    <Msg><Para>Some text. <Emphasis>More text</Emphasis><Para></Msg>
    Mapping the Msg element to an output cloverField will return "Some text. More text". I need to preserve the tags, so I need an output like: "<Msg><Para>Some text. <Emphasis>More text</Emphasis><Para></Msg>".
    I don't know if there's a way to accomplish this by modifying the settings of the XML Reader component, but I was thinking of using regular expressions to solve this. Unfortunately, I can't find a way to do it using the built-in components in Integrator.
    The final solution is to do it programmatically without using Integrator, but I want to know if I'm missing something.
    Thanks in advance.

    This will get the name and extension in separate variables
    set vTheFile to (choose file) as text
    set {TempTID, AppleScript's text item delimiters} to {AppleScript's text item delimiters, ":"}
    set vPath to (vTheFile's text items 1 thru -2 as text)
    -- Add this for the file name
    set vFile to vTheFile's text item -1
    -- This will get the name and extension as separate items
    set AppleScript's text item delimiters to "."
    set {vFileName, VfileNameExtension} to {text item 1 of vFile, text item 2 of vFile}
    set AppleScript's text item delimiters to TempTID
    display dialog "Path is: " & vPath & return & return & "File is: " & vFile & return & "File name is: " & vFileName & return & "Extension is : " & VfileNameExtension
    (but I think I am done now)

  • How to read file header/open filein binary mode

    I am using the FileConnection API to read from a file. How can I open a file in binary mode so that I can read the file header? In other words, I want to open a file in binary mode.

    You can always try looking at the [Header Signatures|http://www.techpathways.com/uploads/headersig.txt]. But it would probably be easier to adjust your application so it doesn't have to guess.
    ~

  • Process multi-record & multi-record-format files using ESB & File Adapter

    I am looking to process/parse an in-bound file with the following make-up and to load into a database table using the File Adapter, ESB and DB Adapter.
    File Make-Up:
    - each line in the file is a record
    - each record is made up of 12 fields
    - there are multiple record types denoted by the first field in the line
    - record types may or may not have common fields
    - where there are common fields, the field may be in different columns
    - each record is to be inserted into a database table
    Sample File:
    3,,"03-0243-0188132-00",.20,26,075,"","000000006026","","","22/04/08",03 1303
    3,,"03-0243-0188132-00",20.00,26,075,"","","","","22/04/08",03 0579
    5,,"03-0243-0188132-00",99.60,,,"OPENING BALA",,,"ACME GROUP","22/04/08",
    6,,"03-0243-0188132-00",99.60,,,"CLOSING BALA",,,"ACME GROUP","22/04/08",
    8,,"03-0243-0188132-00",-346119.05,16,000,"DEBITS",,,,"22/04/08",
    8,,"03-0243-0188132-00",346119.05,349,050,"CREDITS",,,,"22/04/08",
    9,,"03-0243-0188132-00",-346119.05,16,000,"DEBITS",,,,"22/04/08",
    9,,"03-0243-0188132-00",346119.05,349,050,"CREDITS",,,,"22/04/08",
    Record Types and corresponding format:
    3, Corp ID, A/C Number, Trans Amt, Serial Number, Trans Code, Particulars, Analysis Code, Reference, Other Party Name, Transaction Date, Originating Bank
    5, Corp ID, A/C Number, Opening Balance, Serial Number, Trans Code, Particulars, Analysis Code, Reference, Other Party Name, Transaction Date, Originating Bank
    6, Corp ID, A/C Number, Closing Balance, Serial Number, Trans Code, Particulars, Analysis Code, Reference, Other Party Name, Transaction Date, Originating Bank
    8, Corp ID, A/C Number, Amount, Number, 000, “DEBITS”, Analysis Code, Reference, Other Party Name, Transaction Date, Originating Bank
    8, Corp ID, A/C Number, Amount, Number, 050, “CREDITS”, Analysis Code, Reference, Other Party Name, Transaction Date, Originating Bank
    9, Corp ID, A/C Number, Amount, Number, 000, “DEBITS”, Analysis Code, Reference, Other Party Name, Transaction Date, Originating Bank
    9, Corp ID, A/C Number, Amount, Number, 050, “CREDITS”, Analysis Code, Reference, Other Party Name, Transaction Date, Originating Bank
    Please note that record types 8 and 9 have 2 fixed fields.
    I have tried playing around with the File Adapter and the ESB but not having much luck. Can someone provide any suggestions on how I can handle this?

    James,
    Thanks for your prompt response. I have come across your post previously.
    Please excuse my inexperience (very new to SOA and the Oracle SOA Suite), but I have not understood how your post regarding the manual creation of an XSD will assist with my problem. Could you possibly elaborate further on the overall approach I should be taking?
    Regards,
    Simon

  • Issue with generating large pdf file using cfdocument tags in CF9

    We are in the process of upgrading our code to use CF9 and the cfdocument tag (from the old cfx_pdf tags). We have successfully gotten one piece of our code to work, but the file size of the PDF we are now generating is huge in comparison to what it was with the cfx_pdf tags (i.e. with the new code the file is 885 KB, compared to only an 11 KB file with the old code). We are not embedding fonts, so fontembed = "no" didn't help us. We do have all of our images as JPGs, but unfortunately, due to the volume of images we have, we cannot switch all these files to another format. Is there a way to shrink or optimize the PDF file size we are generating?
    Thanks so much for your help.
    Claudia

  • No sound recording avi files using Camileo S20

    Hi all
    I got the S20 on Thursday and it has been a nightmare. Before the sound went, I couldn't get the video files to work in Movie Edit Pro or Sony Vegas.
    It didn't even work in the ArcSoft software that came with it.
    I have tried it on three different computers and it keeps showing up with no audio.
    I have tried all the codecs you can think of and none of them worked. Now I can't hear any sound even when I play it back on the S20.
    I don't really know what to do. I have two other HD camcorders and they work fine on my computer.
    I have had to re-format my computer as the codecs had messed it up. If I had my way I would send it back and be done with it, but I have to wait until tomorrow before I can contact the number I have been given.
    What I want to know is:
    Is there any way I can get the AVI files to work in Sony Vegas and Movie Edit Pro 15 Plus?
    Any help or advice would be welcome.
    thanks
    regards
    Chrisc74

    >Is there any way i can get the avi files to work in sony vegas and movie edit pro 15
    As far as I know, the S20 creates H.264-encoded videos.
    This means that you need the H.264 codec in order to get video and sound on your notebook.
    I cannot say anything about the Sony Vegas or Movie Edit Pro software because I don't use them.
    But I edited my videos using the common Windows Movie Maker and everything was OK.

  • Read doc file using the Word Binary file format

    I want to be able to parse a Word file (xyz.doc). I want to be able to get to the actual text in the file and also get the "document properties" associated with this file. (This is the stuff you enter when you hit the File >> Properties button in Word.)
    I have the binary file format of MS Word 6.0 and I suppose these "document properties" are stored within a segment of this file format.
    Can anyone help me identify which segment of the Microsoft binary Word format this will be in?
    (And of course I want to achieve all the parsing purely in Java!)
    Thanks,
    Supriya

    If you use MFC, yes. Don't ask how, this is the Java forum. As to the Java API, no. There is no API in Java to deal with Microsoft docs; this is Java, not Microsoft.
    Ya, I know that! :P
    You can write your own code, using the information in the doc. The info you refer to is located in the header portion. As to how to read it, best go to a Microsoft forum. Once you understand the format, then post any Java-related questions here, i.e. how do I open a file to read/write etc. Good luck, you will need patience, but you will learn a lot...
    Thanks anyway for your reply! :) Will try with patience.

  • Is it possible to store a back up on external HD by transfering files using Target Disk Mode?

    Dear Apple community,
    There's absolutely no way that I'm going to be able to fix my MBP without a clean install. I simply cannot log in and get to my desktop screen after upgrading to Mavericks. So I will have to erase everything and perform a clean install. However, before proceeding with this plan, I'll have to back up the files from my MBP. Since I cannot use my Mac normally, I'll have to back up my files using Target Disk Mode. I've connected my MBP to another MBP in the past, so I know how the process works. But this time around I won't be able to use another MBP, but a MacBook (MB). I have a few doubts regarding the process, especially concerning adding an external HD to this equation. So please bear with me and see if you can answer my doubts.
    This is the equipment I'll be working with: my MBP, my friend's MacBook (2008) and my 1TB external hard drive.
    My basic question is: once I connect my MBP in Target Disk Mode to the MB, I'm sure I'd be able to transfer my files to this MB. But let's say that I don't want to leave my files on this MB (after all, it's not my computer) and want to send my files straight to an external HD connected to this MB. Is that possible? If so, I know that the only FireWire port on the MB is a 400 port. That means my MBP will be connected to the MB using the only FireWire port available on the MB. So I believe that my external HD would have to be connected to the MB with a USB cable. I assume that this will make the transfer process a little slower than if I could count solely on FireWire ports. Slowing the process down is not a problem, but could this corrupt any of my files during the transfer or even damage anything? I assume that would never happen, after all it's just digital data, but better safe than sorry, right? What do you think?
    Another thought that occurred to me is: let's say I connect my MBP to the MB using FireWire 400. I'd still have a FireWire 800 port available on my MBP. If I connect my external HD to my MBP (yes, the one that will be in Target Disk Mode) using that FW 800 port, would my MBP allow this external HD to mount as if it were just another folder/drive in my computer? If so, theoretically I'd be able to transfer files from my MBP to my external HD manually (by dragging files from folder to folder), right?
    I really don't want to transfer my files from the MBP to the MB and then from the MB to the external HD. That would just take time and I really don't know how much available space I'll find on my friend's MB.
    Oh, one last question: when transferring files using Target Disk Mode, do all computers involved in the process have to have the same OS X version installed, or doesn't it matter if e.g. one is running Snow Leopard and the other Mavericks?
    Looking forward to hearing back from you, experts.

    Grant Bennet-Alder, West of Boston, USA, wrote:
    "A new 1TB external drive (suitable to replace your current internal drive if you choose to) and enclosure can be had for under US$160."
    Ehh, you meant $60, not $160.
    CassHeger wrote:
    "I do need to buy an external HD to be used solely for that purpose."
    Yes, quality HDs are cheap as dirt.
    However, a comparable Toshiba internal 1TB HD is $70 (same as used by Apple):
    http://www.ebay.com/itm/TOSHIBA-MQ01ABD100-1TB-5400-RPM-8MB-Cache-2-5-SATA-3-0Gb-s-Internal-Notebook-/121107538930?pt=US_Internal_Hard_Disk_Drives&hash=item1c329263f2
    Best options for the price, and high-quality HDs:
    Quality 1TB drives are $50 per TB on 3.5" or $65 per TB on 2.5".
    Perfect 1TB for $68:
    http://www.amazon.com/Toshiba-Canvio-Portable-Hard-Drive/dp/B005J7YA3W/ref=sr_1_1?ie=UTF8&qid=1379452568&sr=8-1&keywords=1tb+toshiba
    Nice 500 GB for $50, ultraslim and perfect:
    http://www.amazon.com/Toshiba-Canvio-Portable-External-Drive/dp/B009F1CXI2/ref=sr_1_1?s=electronics&ie=UTF8&qid=1377642728&sr=1-1&keywords=toshiba+slim+500gb
    2.5" USB portable, high quality, best for the cost: Toshiba "tiny giant" 2TB drive (I have several of them; a LOT of storage in a SMALL package), $117:
    http://www.amazon.com/Toshiba-Canvio-Connect-Portable-HDTC720XK3C1/dp/B00CGUMS48/ref=sr_1_4?s=electronics&ie=UTF8&qid=1379182740&sr=1-4&keywords=2tb+toshiba
    This one is the BEST portable external HD available that money can buy:
    HGST Touro Mobile 1TB USB 3.0 External Hard Drive
    $88
    http://www.amazon.com/HGST-Mobile-Portable-External-0S03559/dp/B009GE6JI8/ref=sr_1_1?ie=UTF8&qid=1383238934&sr=8-1&keywords=HGST+Touro+Mobile+Pro+1TB+USB+3.0+7200+RPM
    Most storage experts agree on the Hitachi 2.5".

  • Recording wav file

    Hi, is there any way to record WAV files using Code Composer Studio on the TI 5510 DSK and play them back? I am having problems writing code to record in WAV format. Please help if there is any simple code to understand. Thank you.

    akki20,
    I think you are looking for Texas Instruments. That is, unless you are somehow interfacing the 5510 DSK with LabVIEW or are using Measurement Studio for display / control. Let us know if that is the case.
    Regards,
    Ryan Verret
    Product Marketing Engineer
    Signal Generators
    National Instruments

  • Recording mp3 files

    How do I record MP3 files using iTunes? It wants to burn them as audio files, which will take more than one disc. Is there some other easy program that would be less cumbersome than iTunes? It must be freeware. Suggestions welcome. Thanks.

    The link does not work. Here is the detail:
    Choose iTunes > Preferences, then click the Burning button at the top of the window.
    Choose MP3 CD as the Disc Format, then click OK.
    Select the playlist containing the songs you want to burn to the CD, and make sure all the songs you want to include on the CD have a checkmark beside them.
    You can only burn a CD from the songs in a playlist, and only checked songs are included. For instructions on creating a playlist, click "Tell me more."
    If the playlist contains more songs than will fit on one CD, iTunes will burn as many songs as fit on the CD. You may want to move some of the songs to a new playlist, and then burn each playlist to a separate CD. (You can see the size of the selected playlist at the bottom of the iTunes window.)
    Click the Burn Disc button in the upper-right corner of the iTunes window.
    Insert a blank CD-R disc and click Burn Disc again.
    If you plan to only play the CD using your computer, you can also use a CD-RW disc. To play it on a stereo, use a CD-R disc.
    If your playlist includes songs in formats other than MP3, they will not be burned on the CD.
    Regards

  • New-line Character for a file opened in BINARY mode for O/P

    Can anybody please tell me how to put a new-line character after each record in a file opened in binary mode for output? I can't use text mode for other reasons in my scenario.
    DATA: l_outdata TYPE xstring.
    OPEN DATASET file FOR OUTPUT IN BINARY MODE.
    " ... application logic to populate l_outdata ...
    TRANSFER l_outdata TO file.
    Any help, or attempt towards it will be duely rewarded,
    Thanks in advance,
    Sujit.

    Hi Sujit,
    1. How to put a new-line character after each record:
    how will you decide at which POSITION to put the new-line character, if you have opened the file in binary mode?
    2. However, if you know the exact position(s),
    you can use CL_ABAP_CHAR_UTILITIES=>CR_LF
    to insert the new-line character.
    regards,
    amit m.
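    For illustration, here is a minimal sketch of that idea, assuming each record is first built in a character field, CR_LF is appended, and the result is converted to xstring for the binary TRANSFER. The variable names, the 432-character record length and the use of CL_ABAP_CONV_OUT_CE with UTF-8 are assumptions for this sketch, not from the original post; check the conversion class documentation for your release.
    DATA: l_record  TYPE c LENGTH 432,        " one fixed-length record (assumed)
          l_char    TYPE string,
          l_outdata TYPE xstring,
          lo_conv   TYPE REF TO cl_abap_conv_out_ce.
    lo_conv = cl_abap_conv_out_ce=>create( encoding = 'UTF-8' ).
    " append CR_LF so each record ends with an explicit line break
    CONCATENATE l_record cl_abap_char_utilities=>cr_lf
      INTO l_char RESPECTING BLANKS.
    " convert the character data to a byte string for binary output
    lo_conv->convert( EXPORTING data   = l_char
                      IMPORTING buffer = l_outdata ).
    TRANSFER l_outdata TO file.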

  • Data transfer from application server to sap system in binary mode

    Hi friends,
    can anyone help me out with the following:
    how to read/open the dataset (syntax) when transferring a file from the application server in binary mode?
    how to move the file to a different location on the same server?
    regards,
    mohan

    Hi,
    Syntax for reading the file:
    OPEN DATASET file FOR INPUT IN BINARY MODE.
    IF sy-subrc = 0.
      DO.
        READ DATASET file INTO wa_record.
        IF sy-subrc <> 0.
          EXIT.              " leave the loop at end of file
        ENDIF.
      ENDDO.
      CLOSE DATASET file.
    ENDIF.
    Syntax for writing the file:
    OPEN DATASET file FOR OUTPUT IN BINARY MODE.
    IF sy-subrc = 0.
      TRANSFER wa_record TO file.
      CLOSE DATASET file.
    ENDIF.
    You can also transfer a file from the application server to the presentation server using transaction CG3Y, by specifying the source path (application server), the destination path (presentation server) and the transfer format (BIN).
    To copy a file from one location to another on the application server,
    use the FM 'ARCHIVFILE_SERVER_TO_SERVER'.
    reward points if needful..
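    If that function module is not available in your system or its interface is unclear, a plain copy in binary mode also works. Below is a minimal sketch under the assumption that both paths point to the same application server and that the file is small enough to hold in memory; the path values are placeholders.
    DATA: lv_src TYPE string VALUE '/tmp/source.dat',   " assumed source path
          lv_dst TYPE string VALUE '/tmp/target.dat',   " assumed target path
          lv_buf TYPE xstring.
    OPEN DATASET lv_src FOR INPUT IN BINARY MODE.
    IF sy-subrc = 0.
      " no length given: reads the remaining (whole) file content
      READ DATASET lv_src INTO lv_buf.
      CLOSE DATASET lv_src.
      OPEN DATASET lv_dst FOR OUTPUT IN BINARY MODE.
      IF sy-subrc = 0.
        TRANSFER lv_buf TO lv_dst.
        CLOSE DATASET lv_dst.
      ENDIF.
      " optionally remove the original afterwards:
      " DELETE DATASET lv_src.
    ENDIF.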
