Recording and Streaming?

I'd like to record some DJ sessions and live concerts. Can you run a line-in into the iPad (from the output of a mixer)? So far the apps I've seen only record from the built-in mic. Also, is there an app that lets you stream recordings, live or at all for that matter?

Ok... This may work for you. You need these three components:
Griffin iMic USB Audio Interface
http://store.apple.com/us/product/TX532ZM/A?mco=MTY3ODQ5OTY
Apple iPad Camera Connection Kit
http://store.apple.com/us/product/MC531ZM/A?fnode=MTc0MjU4NjE&mco=MTc0Njk4NzU
Audio Cable
1/8 in./3.5mm
This is how I set it up to test whether this would work, and it does. Connect the USB Camera Connector to the iPad. Plug the Griffin iMic USB Audio Interface into the USB Camera Connector. Plug a 1/8 in. audio cable into the Griffin iMic and the other end into an iPod nano. I used Memo Recorder, free from the App Store, to record the audio. Works no problem.
You might have to use a 1/8 in. cable with RCA plugs on the other end, depending on the output connector on the mixer.

Similar Messages

  • Record and Streaming playback

    Hi guys I am a newbie to FMS, got excited about it when I heard that Amazon cloud services are now including FMS for a marginal increase in cost.
    So all I am looking to do is:
    1) Record off a web cam and store the video using FMS
    2) Play the video stream using FMS.
    If someone could point the way to some sample code, then I will be off and running.
    Thanks a bunch.

    Go through FMS documentation - it should have simple code snippets for you to get started off.

  • JMF recording and streaming at the same time

    Hello Everyone,
    I was hoping someone could help me with using a webcam device for sending and recording at the same time.
    At this moment I'm only able to record. I tried to make another processor and connect the input of the streaming processor
    to the one that is recording, but the problem is that the streaming processor's output is not a streaming format. If I could stream
    with RTP I could embed a player in a website and view it from there, while recording at the same time.
    Could somebody explain to me what to do here, or provide me a snippet of code; I really need this to work.
    I created this code, but this only records.
    import java.io.IOException;
    import javax.media.*;
    import javax.media.format.*;
    import javax.media.protocol.*;

    public class Record {

         // Builds a realized Processor that captures the default audio/video
         // devices and produces a QuickTime output stream.
         public Processor createProcessor() {
              Format[] formats = new Format[2];
              formats[0] = new AudioFormat(AudioFormat.IMA4);      // IMA4 audio
              formats[1] = new VideoFormat(VideoFormat.CINEPAK);   // Cinepak video
              FileTypeDescriptor outputType =
                   new FileTypeDescriptor(FileTypeDescriptor.QUICKTIME);
              Processor p = null;
              try {
                   p = Manager.createRealizedProcessor(new ProcessorModel(formats, outputType));
              } catch (IOException e) {
                   System.out.println("Error! 1");
                   System.exit(-1);
              } catch (NoProcessorException e) {
                   System.out.println("Error! 2");
                   System.exit(-1);
              } catch (CannotRealizeException e) {
                   System.out.println("Error! 3");
                   System.exit(-1);
              }
              return p;
         }

         public static void main(String[] args) {
              Record record = new Record();
              Processor p = record.createProcessor();
              DataSink sink;
              // QuickTime output, so write to a .mov file
              MediaLocator dest = new MediaLocator("file://newfile.mov");
              try {
                   sink = Manager.createDataSink(p.getDataOutput(), dest);
                   sink.open();
                   p.start();
                   sink.start();
                   // Record for 6 seconds, then shut everything down
                   Thread.sleep(6000);
                   p.stop();
                   p.close();
                   Thread.sleep(6000);   // give the sink time to flush
                   sink.close();
              } catch (Exception exception) {
                   System.out.println("error!");
              }
              System.out.println("finished");
         }
    }
    Kind regards,
    Bluesboy89

    Take a look at the following example. It demonstrates how to do 2 things with the same input (namely view and record)
    [http://java.sun.com/javase/technologies/desktop/media/jmf/2.1.1/solutions/JVidCap.html]
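    To give a rough idea of how that example's "two branches from one input" idea maps onto the record-plus-RTP case asked about above, here is a minimal, untested Java sketch. The capture locator (vfw://0), the RTP address (rtp://192.168.1.10:22222/video), the file name and the chosen formats are placeholders, not values from the original post:

    import javax.media.*;
    import javax.media.format.*;
    import javax.media.protocol.*;

    public class RecordAndStream {
         public static void main(String[] args) throws Exception {
              // Capture device locator -- placeholder; query CaptureDeviceManager in real code
              MediaLocator camera = new MediaLocator("vfw://0");

              // Wrap the capture source so two branches can consume it
              DataSource source = Manager.createDataSource(camera);
              DataSource cloneable = Manager.createCloneableDataSource(source);
              DataSource clone = ((SourceCloneable) cloneable).createClone();

              // Branch 1: record the cloneable source to a QuickTime file
              Processor recProc = Manager.createRealizedProcessor(new ProcessorModel(
                        cloneable,
                        new Format[] { new VideoFormat(VideoFormat.CINEPAK) },
                        new FileTypeDescriptor(FileTypeDescriptor.QUICKTIME)));
              DataSink fileSink = Manager.createDataSink(
                        recProc.getDataOutput(), new MediaLocator("file://recording.mov"));
              fileSink.open();
              recProc.start();          // the clone only gets data once this source is running
              fileSink.start();

              // Branch 2: transcode the clone to JPEG/RTP and transmit it
              Processor rtpProc = Manager.createRealizedProcessor(new ProcessorModel(
                        clone,
                        new Format[] { new VideoFormat(VideoFormat.JPEG_RTP) },
                        new ContentDescriptor(ContentDescriptor.RAW_RTP)));
              DataSink rtpSink = Manager.createDataSink(
                        rtpProc.getDataOutput(), new MediaLocator("rtp://192.168.1.10:22222/video"));
              rtpSink.open();
              rtpProc.start();
              rtpSink.start();
         }
    }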

  • Recording a stream and using it on the fly

    Hi,
    I'm trying to upload a video stream from a flash client and
    convert the video stream on the fly (from an external converter
    like ffmpeg or vlc).
    Since I don't manage to send the stream directly to the
    converter, a solution I have found is to publish the stream as
    "record", and have a converter read the file to perform the
    operation.
    However, when setting the record method for publishing, the
    recorded file is not written continuously. Incoming stream is
    buffered, written, buffered, written and so on. So, the program
    reading the file stops when it reaches the end.
    Is there a way in FMS2 to either:
    - forward incoming RTMP stream to a stream converter (using
    an outgoing RTSP stream or something like that) ?
    - force FMS2 to dump the stream continuously, as soon as it
    receives the incoming data ?
    Thanks for any help,
    Greg

    Stream.syncWrite
    Availability: Flash Media Server 2.
    Usage: myStream.syncWrite
    Description
    Property; a Boolean value that controls when a stream writes the contents of the buffer to an FLV file while the stream is recording. When syncWrite is true, all the messages that pass through the stream are flushed to the FLV file immediately. It is highly recommended that you only set syncWrite to true in a stream that contains only data. Synchronization problems might occur when syncWrite is set to true in a stream that contains data and audio, video, or both.
    ... sounds like a problem is occurring :)

  • How do I make a quality recording of streaming music (and get into iTunes)?

    I'd like to make a recording of some streaming music.  I called AppleCare and the agent suggested using QuickTime.  Yes, I can make an audio recording, but it included a lot of background noise. (I could even hear myself tapping on my computer when I listened to the playback.)  Next, I tried using Soundflower because, from the "Internal Audio" section of this page, I got the impression that I wouldn't be using the "live" mic on my MacBook Air, but it still did.  Again, I could hear background noise.
    Is there a way to make a better recording of streamed music from the internet...without being a technical wiz?  I am not.  Thanks for your help!

    If you do a general web search you will find multiple answers.  iTunes does not do what you want, nor does any other Apple software as far as I am aware.  What you are doing may, strictly speaking, be skirting legality (if the stream producer doesn't provide a simple way to download), which makes providing an answer on Apple's forum (where, according to the terms of use, we have to stay legal) complicated.

  • How to publish and record video stream in h.264 format

    I'm a fresher. I want to publish and record a video stream in H.264 and an audio stream in AAC format.
    Does Flash Player support publishing a video stream in H.264 format?
    And which versions of Flash Player and FMS support this?
    Thank you very much.

    The Flash Player does not support H.264 or AAC publishing. It supports only H.263 (Sorenson Spark) video, and Nellymoser ASAO and Speex audio.
    If you want to publish H.264 you'll need to use Flash Media Live Encoder or a third-party encoder.

  • Time shifting and record live stream

    Hello everyone,
    I would like to stream videos to my viewers and give them the ability to time-shift the live stream.
    Can I use the live application for time shifting, or should I use livepkgr to do that?
    To archive and play back in VOD mode I would like to stream live and record at the same time. In livepkgr I see that a stream file with the .f4f extension is created automatically in the streams folder, whereas in the live application I have to call record on the FMS side to save the live stream on the server.
    Please advise me on the better solution for archiving and time shifting.
    Many Thanks.

    The livepkgr application is for HTTP streaming...
    If you just want to stream over RTMP, use the live application and add code to record the stream. Recording is pretty simple in AMS: just call Stream.get(), s.record(...) and s.play(), where s is the stream you are recording:
    application.onPublish = function(clientObj, streamObj) {
         trace("recording started...");
         var strName = "recorded_" + streamObj.name;
         var s = Stream.get(strName);
         if (s == undefined)
              return;
         s.onStatus = function(info) {
              trace(info.code);
         };
         if (!s.record("record")) {
              trace("record failed.");
         }
         s.play(streamObj.name, -1, -1, true);
    };

  • RTP stream - record and listen simultaneously ...

    Hi!
    I would like to record and listen to an RTP stream simultaneously, but I don't know how... I can record to a file OR listen on headphones, but only separately.
    If I create a DataSink for recording on a port, I cannot use that port to listen. But in that case how can I listen to the audio stream on headphones?
    Maybe the good solution for me would be to clone the stream, but I cannot, because I get an exception:
    javax.media.NoProcessorException: Cannot find a Processor for: com.ibm.media.protocol.CloneablePushBufferDataSource@1d8957f
    This is the code:
    DataSource ds = Manager.createCloneableDataSource(Manager.createDataSource(media_locator));
    processor = Manager.createRealizedProcessor(new ProcessorModel(ds, formats, outputType));
    Has anybody got any idea?
    Thanks!
    Ric Flair

    I'd point you in this direction initially.
    [http://java.sun.com/javase/technologies/desktop/media/jmf/2.1.1/solutions/Clone.java]
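    Along the lines of that Clone.java example, one possible arrangement is to build the recording Processor from the cloneable source and hang a plain Player off the clone for the headphones. Below is a rough, untested sketch; the RTP address, the linear-PCM/WAV recording format and the output file name are assumptions, not values from the original post:

    import javax.media.*;
    import javax.media.format.*;
    import javax.media.protocol.*;

    public class RecordAndListen {
         public static void main(String[] args) throws Exception {
              // Incoming RTP audio session -- replace host/port with your own
              MediaLocator rtpLoc = new MediaLocator("rtp://192.168.1.20:49150/audio");

              // Make the incoming source cloneable so it can feed two branches
              DataSource original = Manager.createDataSource(rtpLoc);
              DataSource cloneable = Manager.createCloneableDataSource(original);
              DataSource clone = ((SourceCloneable) cloneable).createClone();

              // Branch 1: record the cloneable source to a WAV file
              Processor recorder = Manager.createRealizedProcessor(new ProcessorModel(
                        cloneable,
                        new Format[] { new AudioFormat(AudioFormat.LINEAR) },
                        new FileTypeDescriptor(FileTypeDescriptor.WAVE)));
              DataSink sink = Manager.createDataSink(
                        recorder.getDataOutput(), new MediaLocator("file://capture.wav"));
              sink.open();
              recorder.start();     // the clone only receives data once the cloneable source runs
              sink.start();

              // Branch 2: play the clone on the default audio output (headphones)
              Player player = Manager.createRealizedPlayer(clone);
              player.start();
         }
    }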

  • Audio stream - record and listen simultaneously

    Hi!
    I would like to record and listen to an RTP stream simultaneously, but I don't know how... I can record to a file OR listen on headphones, but only separately.
    If I create a DataSink for recording on a port, I cannot use that port to listen. But in that case how can I listen to the audio stream on headphones?
    Maybe the good solution for me would be to clone the stream, but I cannot, because I get an exception:
    javax.media.NoProcessorException: Cannot find a Processor for: com.ibm.media.protocol.CloneablePushBufferDataSource@1d8957f
    This is the code:
    DataSource ds = Manager.createCloneableDataSource(Manager.createDataSource(media_locator));
    processor = Manager.createRealizedProcessor(new ProcessorModel(ds, formats, outputType));
    Has anybody got any idea?
    Thanks!
    Ric Flair

    rflair wrote:
    Thank You, it was not a big help for me...
    That's too bad. But considering that JavaSound doesn't handle RTP, and all of the code you've posted is JMF code, that doesn't change the fact that it's a JMF question...

  • Recording live streams and streaming them through VOD service

    We have a licensed FMS Interactive 4.5. Currently we have a once-a-day live stream event which is streamed by FMLE to the livepkgr application. As far as I understand, the event is recorded on the server. We want to share that recording on the FMS vod service. Sadly I cannot find any real information on how to do this. Do we have to code a custom Flash application for FMS for that, like a livepkgr2? How do we get the recording and copy the file at the exact moment the recording finishes? Maybe somebody has already done this?

    You can create an app folder in the FMS folder, which is inside your setup folder.
    For details you can mail [email protected]

  • OTA HDTV Recorded, Converted, and Streamed to ATV ... Results!!!

    Well, I just tried HD streaming and it worked flawlessly ... although it takes a bit of effort. Here is how I did it ...
    1. I recorded an OTA 720p HDTV broadcast using my El Gato Hybrid;
    2. I exported the program from EyeTV 2 as a Quicktime 1280x720 HD file;
    3. I then opened the exported QT file in QuickTime Pro and exported it again using "Movie to ATV" (I had to do this because the original QT file would not stream);
    4. I dropped the converted file in to iTunes, and streamed it to the Apple TV playing on my 44" 720P HDTV.
    Final Streamable File Specs: 960 x 540 ... I am not sure why QT Pro converted the original 1280x720 file to the lower res, but it did and the file streamed and looked GREAT.
    Results: I must say ... IT LOOKED SPECTACULAR. I am a bit surprised given the low-end specs of the ATV. No dropped frames, no glitches, no hiccups. If they offer HD movies on iTunes, they are going to look GREAT!
    The Downside: It took FOREVER to convert the 4 minute HD QT file using QTPro.

    Just an FYI ... the quality fell somewhere between OTA HDTV quality and DVD quality. I would say just about the same as cable or sat HDTV quality.

  • Custom metadata and record live stream

    I tried to inject my own metadata into a live stream (webcam) recorded on FMS 4.5.1
    with no success. I was able to change only the standard metadata; my custom ones were not recorded,
    despite all the examples on Adobe LiveDocs. No luck (AS2 and AS3).
    this example
    http://help.adobe.com/en_US/flashmediaserver/devguide/WS5b3ccc516d4fbf351e63e3d11a0773d56e-7ff6Dev.html
    doesn't work at all for custom metadata, and it doesn't EVEN record the special metadata like createdby and creationdate.
    example:
    metaData.createdby = String(streamTitle + " of "+fullName);
    metaData.creationdate = String(now.toString());
    metaData.tags = {createdby:metaData.createdby, creationdate:metaData.creationdate};
    metaData.width = Number(devicesQuality.width);
    metaData.height = Number(devicesQuality.height);
    nodeSendStream.send("@setDataFrame", "onMetaData", metaData);
    Once the stream is recorded I can see only metaData.width and height; createdby, creationdate and tags don't exist.
    Note: I encode and stream from Flash Player 11.1 in H.264 with Speex or PCMU.
    Thanks

    well,
    I followed the example at the link you gave exactly, modifying the Application.xml, but if it were a segment problem,
    how would you explain that the NON-custom metadata are successfully modified?
    please check this link
    http://help.adobe.com/en_US/flashmediaserver/devguide/WS5b3ccc516d4fbf351e63e3d11a0773d56e-7ff6Dev.html
    I even tried the official properties createdby and creationdate without success. I was able to change only width and height.
    The code I use on the client is exactly the example from the Adobe manual, in AS2 and AS3.
    Thanks

  • What is the fastest way to record and write image data?

    Hello,
    I am new to labview and am using the IMAQ software package to record and write data from a CCD camera to the hard drive. The end goal is a program that records and saves as much data as possible as quickly as possible, for experiments lasting on the scale of days. I have been experimenting with the snap, grab, and sequence methods of recording. To save the image data I was told by NI customer support that TDMS streaming would be the fastest method. I have also been experimenting with the TDMS VIs, however I have found the IMAQ Write File 2 VI to be faster in practice. Am I doing something wrong with the TDMS file structure? Is there a more efficient way to convert IMAQ image data to a dataform that can be written as TDMS? I am posting two of my programs to provide a clearer example of what I am trying to do. The first takes a snap of the image and appends it to a TDMS file. The second is the fastest I have found so far and uses Grab to record the images and the IMAQ Write File VI to save them. 
    Thanks
    Attachments:
    Camera Capture (Snap) and stream TDMS.vi ‏24 KB
    Camera Capture (Grab) and write image.vi ‏24 KB

    Hi
    For me it is no surprise that the second VI is faster than the first one; the reason is that you cannot compare these two VIs with each other.
    In the first VI you work with TDMS files, in the second with PNG files. The latter is much faster, because TDMS files need a lot of disk space.
    Second point: why do you open, write and close a TDMS file for each iteration of the for loop in your first VI? That really needs a lot of resources, so it can't be fast.
    Save your converted pictures in one array, and after the acquisition you can save them in one TDMS file. If you need a TDMS file for each picture, be aware that this takes some time.
    Now, regarding the architecture of your first VI, please look at LL Sequence.vi from the examples (C:\Program Files\National Instruments\LabVIEW 8.6\examples\IMAQ\IMAQ Low Level.llb\LL Sequence.vi); there you can see how an acquisition of a number of images is meant to be done.
    If you want it to be really fast, use a producer/consumer architecture (see the examples in LabVIEW for that).
    In the first while loop you acquire the images and write them into a queue, and in the second, parallel while loop you read the queue and save the images (see the attachment and the generic sketch below).
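    Since the attached VIs can't be shown inline, here is a minimal, language-agnostic illustration of that producer/consumer shape, written in Java purely as a sketch; the frame type, frame count and queue size are placeholders:

    import java.util.concurrent.ArrayBlockingQueue;
    import java.util.concurrent.BlockingQueue;

    public class ProducerConsumerSketch {
         public static void main(String[] args) throws InterruptedException {
              // Bounded queue decouples acquisition from disk writing
              BlockingQueue<byte[]> queue = new ArrayBlockingQueue<>(64);

              Thread producer = new Thread(() -> {
                   try {
                        for (int i = 0; i < 1000; i++) {
                             byte[] frame = new byte[1024];   // stand-in for an acquired image
                             queue.put(frame);                // blocks if the consumer falls behind
                        }
                   } catch (InterruptedException e) {
                        Thread.currentThread().interrupt();
                   }
              });

              Thread consumer = new Thread(() -> {
                   try {
                        for (int i = 0; i < 1000; i++) {
                             byte[] frame = queue.take();     // blocks until a frame is available
                             // write `frame` to disk here
                        }
                   } catch (InterruptedException e) {
                        Thread.currentThread().interrupt();
                   }
              });

              producer.start();
              consumer.start();
              producer.join();
              consumer.join();
         }
    }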
    Hope that helps
    Kind regards
    Beat
    NI Germany
    Attachments:
    ProducerConsumerData.vi ‏10 KB
    ProducerConsumerData_LV8.5.vi ‏12 KB

  • Multiple bitrate encoder creates 3 separate events and streams instead of a single event.

    I'm using the Niagara 4100 streaming appliance to encode live multiple-bitrate streams for an event. The problem is the Niagara encoder doesn't allow me to attach an event name to the stream URL like 'streamevent%i?adbe-live-event=liveevent'; instead it only allows 'streamevent%i'. So in the end I get three separate events with a stream for each event.
    Like so:
    Example:
    /Events/streamevent1/MNMMNNMxxxxx.stream 
    /Events/streamevent2/MNMMNNMxxxxx.stream
    /Events/streamevent3/MNMMNNMxxxxx.stream
    AND
    /Streams/streamevent1/files…
    /Streams/streamevent2/files…
    /Streams/streamevent3/files…
    Is it possible to make it work this way without using the 'adbe-live-event' parameter using the .f4m manifest file?
    Thanks,
    Dave

    There is a workaround available, but you can also try using Flash Media Live Encoder, which is a free tool, for better results. Go to the C:\FMSHOME\samples\applications\livepkgr directory and you will find a main.asc file in there. This is the server-side script file attached to the livepkgr application.
    In it, edit the liveEventName variable and set it to the event you want to attach your streams to, e.g. "liveevent".
    Now create a separate copy of your livepkgr application inside C:\FMSHOME\applications\, e.g. livepkgr_mbr, and replace the main.far file with the main.asc you just edited. Then, while publishing, connect to this livepkgr_mbr application and just specify the stream name livestream%i. Check your permissions before starting the recording.
    This should work for you. But please remember that doing this will cause all your streams to be associated with the same event, liveevent, since you are hard-coding it inside the main.asc file. So it's best to keep a separate application just for this purpose.
