Trouble merging audio and video

Hi everybody. I'm using 'merge clips' to combine audio recorded separately from video. We also had a mic on the camera, so we can hear how it syncs up. The clip is an hour long. The audio starts out in sync, but gradually and steadily lags behind the video, falling another 1-2 frames behind every 30 seconds or so. Very annoying. Any idea why this would happen and how to fix it?

wing hunter wrote: unknown audio recording device. I got them as wav files.
I would be hunting down the guy who recorded them, finding out what the unknown device is, and what its settings are/were at the time, as well as what they did before delivering the files. Were they imported from the device into some sort of software? File conversions? I know ProTools projects have selectable frame rates, and that needs to match your footage frame rate or you will have sync issues.
Lots of variables that need to be determined before we can help.
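A back-of-the-envelope check shows why a tiny clock or sample-rate mismatch is the usual suspect for steady drift like this. The numbers below are illustrative assumptions (we don't know the real recorder settings), not measurements from the poster's files:

```java
// Sketch: how much audio/video drift a small clock mismatch produces.
// All figures are illustrative assumptions, not facts about the poster's files.
public class DriftEstimate {
    // Seconds of drift accumulated after `elapsed` seconds of playback,
    // when audio stamped at `nominalRate` Hz was actually recorded at `actualRate` Hz.
    static double driftSeconds(double nominalRate, double actualRate, double elapsed) {
        return elapsed * (actualRate - nominalRate) / nominalRate;
    }

    public static void main(String[] args) {
        // The 0.1% "pull-up/pull-down" mismatch seen in film/video workflows:
        // audio recorded at 48,048 Hz but interpreted as 48,000 Hz.
        double drift30s = driftSeconds(48000.0, 48048.0, 30.0); // 0.03 s per 30 s
        double framesPer30s = drift30s * 24.0;                  // at 24 fps
        System.out.printf("Drift per 30 s: %.3f s (~%.1f frames at 24 fps)%n",
                drift30s, framesPer30s);
        // ~0.7 frames per 30 s: the same order as the 1-2 frames the poster sees,
        // so a sample-rate (or project frame-rate) mismatch fits the symptom.
    }
}
```

Because the implied error is a constant ratio, a single speed change (retiming the audio by that ratio) fixes the whole clip, which is why pinning down the recorder's actual settings matters.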

Similar Messages

  • How to demultiplex the merged audio and video streams

    Hi. I'm currently working on a video conferencing program. I successfully implemented the program using threads to run the audio and video clients and servers. The problem is that the transmission delay for both media is a little too great. So now I'm trying to merge both media into one DataSource, so that I don't have to worry about sending audio and video over two different ports and RTP sessions. I know how to merge the audio and video streams, but I was wondering how to extract each medium back out after receiving the merged stream. Any ideas? I used the createMergingDataSource(the_DataSource_array) method to merge the streams. Thanks.
    J.L.

    I am working on the same problem. Is there any solution yet?
    Regards
    C.Eckert
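For the question above: as far as I know, the DataSource returned by createMergingDataSource still exposes its component streams (via getStreams() on the push/pull-buffer interface), so the tracks are not irreversibly mixed. If you are instead hand-rolling the transport over a single port, the usual approach is to tag each packet with its media type so the receiver can route it back. A JMF-free sketch of that tagging idea (MediaMux and its constants are hypothetical names, not JMF API):

```java
import java.nio.ByteBuffer;
import java.util.ArrayList;
import java.util.List;

// Sketch (hypothetical protocol, not a JMF API): prefix each packet with a
// one-byte media tag so a single channel can carry both streams and the
// receiver can demultiplex them again.
public class MediaMux {
    static final byte AUDIO = 0, VIDEO = 1;

    // Wrap a raw media frame with its type tag before sending.
    static byte[] mux(byte type, byte[] payload) {
        ByteBuffer b = ByteBuffer.allocate(1 + payload.length);
        b.put(type).put(payload);
        return b.array();
    }

    // Route a tagged packet back to the right stream on the receiving side.
    static void demux(byte[] packet, List<byte[]> audioOut, List<byte[]> videoOut) {
        byte[] payload = new byte[packet.length - 1];
        System.arraycopy(packet, 1, payload, 0, payload.length);
        (packet[0] == AUDIO ? audioOut : videoOut).add(payload);
    }

    public static void main(String[] args) {
        List<byte[]> audio = new ArrayList<>(), video = new ArrayList<>();
        demux(mux(AUDIO, new byte[]{10, 20}), audio, video);
        demux(mux(VIDEO, new byte[]{30}), audio, video);
        System.out.println(audio.size() + " audio, " + video.size() + " video");
    }
}
```

Note that RTP already does this per-session via payload types and SSRCs, so a tag byte is only needed if you truly collapse everything onto one raw socket.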

  • FCP7; "unmerge" merged audio and video clips in the timeline?

    Our TC was way off in some areas due to a bad timecode LEMO on set. My editor synced by slate, then merged the video with an audio mixdown track using the in points. Now when I export an EDL, the original TC from the audio clip is gone, so he has no reference for matching up the audio. Does anybody know of a way to un-merge these clips and restore the original filename (the audio takes the name of the merged video but reveals correctly in the Finder) and timecode? Thanks.

    What happens if you match frame the audio? I would think it would match back to the source audio clip, which you could then use to replace the audio in the timeline.

  • Merge Audio And Video [Help]

    Dear guys,
    I need some help. My problem: when I try to merge the video and audio, only the audio comes out.
    This is the code:
    private boolean mergeAudioVideo() {
        try {
            String tempTotal = myGrabber.tempFile + myGrabber.cntMovies + myGrabber.extension;
            String audioFile = "";
            System.out.println(myGrabber.movPath + tempTotal);
            audioFile = mySampler.audioFile.toURL().toString();
            String mergeArguments[] = {"-o", myGrabber.movFile, myGrabber.movPath + tempTotal, audioFile};
            if (!myProgressBar.cancelled) {
                new Merge(mergeArguments, myProgressBar);
                return true;
            } else {
                return false;
            }
        } catch (Exception e) {
            outWindow.out("Failed to merge the files");
            outWindow.out("" + e);
        } catch (OutOfMemoryError o) {
            outWindow.out("Failed to merge the files");
            outWindow.out("" + o);
        }
        return false; // the merge did not complete
    }
    Merge.java
    package camcap.Recording;

    import java.io.File;
    import javax.media.*;
    import javax.media.format.*;
    import javax.media.protocol.*;
    import javax.media.protocol.DataSource;
    import javax.media.datasink.*;
    import java.util.Vector;

    /**
     * Merges tracks from different inputs and generates a QuickTime file
     * with the merged tracks.
     * @author Shinlei
     */
    public class Merge implements ControllerListener, DataSinkListener {

        Vector sourcesURLs = new Vector(1);
        Processor[] processors = null;
        String outputFile = null;
        String videoEncoding = "JPEG";
        String audioEncoding = "LINEAR";
        String outputType = FileTypeDescriptor.QUICKTIME;
        DataSource[] dataOutputs = null;
        DataSource merger = null;
        DataSource outputDataSource;
        Processor outputProcessor;
        ProcessorModel outputPM;
        DataSink outputDataSink;
        MediaLocator outputLocator;
        boolean done = false;
        VideoFormat videoFormat = null;
        AudioFormat audioFormat = null;
        public camcap.UserInterface.EncodingProgressBar myProgressBar;

        public Merge(String[] args) {
            parseArgs(args);
            if (sourcesURLs.size() < 2) {
                System.err.println("Need at least two source URLs");
                showUsage();
            } else {
                doMerge();
            }
        }

        public Merge(String[] args, camcap.UserInterface.EncodingProgressBar p) {
            myProgressBar = p;
            parseArgs(args);
            if (sourcesURLs.size() < 2) {
                System.err.println("Need at least two source URLs");
                showUsage();
            } else {
                doMerge();
            }
        }

        private void doMerge() {
            processors = new Processor[sourcesURLs.size()];
            dataOutputs = new DataSource[sourcesURLs.size()];
            for (int i = 0; i < sourcesURLs.size(); i++) {
                String source = (String) sourcesURLs.elementAt(i);
                MediaLocator ml = new MediaLocator(source);
                ProcessorModel pm = new MyPM(ml);
                try {
                    processors[i] = Manager.createRealizedProcessor(pm);
                    dataOutputs[i] = processors[i].getDataOutput(); // note: processors[i], not processors
                    processors[i].start();
                } catch (Exception e) {
                    System.err.println("Failed to create a processor: " + e);
                    System.exit(-1);
                }
            }
            // Merge the data sources
            try {
                merger = Manager.createMergingDataSource(dataOutputs);
                merger.connect();
                merger.start();
            } catch (Exception ex) {
                System.err.println("Failed to merge data sources: " + ex);
                System.exit(-1);
            }
            if (merger == null) {
                System.err.println("Failed to merge data sources");
                System.exit(-1);
            }
            try {
                Player p = Manager.createPlayer(merger);
                new com.sun.media.ui.PlayerWindow(p);
            } catch (Exception e) {
                System.err.println("Failed to create player " + e);
            }
            // Create the output from the processor
            ProcessorModel outputPM = new MyPMOut(merger);
            try {
                outputProcessor = Manager.createRealizedProcessor(outputPM);
                outputDataSource = outputProcessor.getDataOutput();
            } catch (Exception exc) {
                System.err.println("Failed to create output processor: " + exc);
                System.exit(-1);
            }
            try {
                outputLocator = new MediaLocator(outputFile);
                outputDataSink = Manager.createDataSink(outputDataSource, outputLocator);
                outputDataSink.open();
            } catch (Exception exce) {
                System.err.println("Failed to create output DataSink: " + exce);
                System.exit(-1);
            }
            outputProcessor.addControllerListener(this);
            outputDataSink.addDataSinkListener(this);
            System.err.println("Merging...");
            try {
                outputDataSink.start();
                outputProcessor.start();
            } catch (Exception excep) {
                System.err.println("Failed to start file writing: " + excep);
                System.exit(-1);
            }
            int count = 0;
            while (!done) {
                try {
                    Thread.sleep(100);
                } catch (InterruptedException ie) {
                }
                // Stop the merge if the user cancelled via the progress bar
                try {
                    if (myProgressBar.cancelled) {
                        try {
                            outputDataSink.stop();
                            outputProcessor.stop();
                            done = true;
                            System.out.println("Stopped merge datasink");
                        } catch (Exception e) {
                            System.out.println("Couldn't stop merge datasink");
                            System.out.println(e);
                        }
                    }
                } catch (Exception e) {}
                if (outputProcessor != null &&
                        (int) (outputProcessor.getMediaTime().getSeconds()) > count) {
                    System.err.print(".");
                    count = (int) (outputProcessor.getMediaTime().getSeconds());
                }
            }
            if (outputDataSink != null) {
                outputDataSink.close();
            }
            synchronized (this) {
                if (outputProcessor != null) {
                    outputProcessor.close();
                }
            }
            System.err.println("Done!");
        }

        public void controllerUpdate(ControllerEvent ce) {
            if (ce instanceof EndOfMediaEvent) {
                synchronized (this) {
                    outputProcessor.close();
                    outputProcessor = null;
                }
            }
        }

        public void dataSinkUpdate(DataSinkEvent dse) {
            if (dse instanceof EndOfStreamEvent) {
                done = true;
            } else if (dse instanceof DataSinkErrorEvent) {
                done = true;
            }
        }

        class MyPM extends ProcessorModel {
            MediaLocator inputLocator;

            public MyPM(MediaLocator inputLocator) {
                this.inputLocator = inputLocator;
            }

            public ContentDescriptor getContentDescriptor() {
                return new ContentDescriptor(ContentDescriptor.RAW);
            }

            public DataSource getInputDataSource() {
                return null;
            }

            public MediaLocator getInputLocator() {
                return inputLocator;
            }

            public Format getOutputTrackFormat(int index) {
                return null;
            }

            public int getTrackCount(int n) {
                return n;
            }

            public boolean isFormatAcceptable(int index, Format format) {
                if (videoFormat == null) {
                    videoFormat = new VideoFormat(videoEncoding);
                }
                if (audioFormat == null) {
                    audioFormat = new AudioFormat(audioEncoding);
                }
                return format.matches(videoFormat) || format.matches(audioFormat);
            }
        }

        class MyPMOut extends ProcessorModel {
            DataSource inputDataSource;

            public MyPMOut(DataSource inputDataSource) {
                this.inputDataSource = inputDataSource;
            }

            public ContentDescriptor getContentDescriptor() {
                return new FileTypeDescriptor(outputType);
            }

            public DataSource getInputDataSource() {
                return inputDataSource;
            }

            public MediaLocator getInputLocator() {
                return null;
            }

            public Format getOutputTrackFormat(int index) {
                return null;
            }

            public int getTrackCount(int n) {
                return n;
            }

            public boolean isFormatAcceptable(int index, Format format) {
                if (videoFormat == null) {
                    videoFormat = new VideoFormat(videoEncoding);
                }
                if (audioFormat == null) {
                    audioFormat = new AudioFormat(audioEncoding);
                }
                return format.matches(videoFormat) || format.matches(audioFormat);
            }
        }

        private void showUsage() {
            System.err.println("Usage: Merge <url1> <url2> [<url3> ... ] [-o <out URL>] [-v <video_encoding>] [-a <audio_encoding>] [-t <content_type>]");
        }

        private void parseArgs(String[] args) {
            int i = 0;
            while (i < args.length) {
                if (args[i].equals("-h")) {
                    showUsage();
                } else if (args[i].equals("-o")) {
                    i++;
                    outputFile = args[i];
                } else if (args[i].equals("-t")) {
                    i++;
                    outputType = args[i];
                } else if (args[i].equals("-v")) {
                    i++;
                    videoEncoding = args[i];
                } else if (args[i].equals("-a")) {
                    i++;
                    audioEncoding = args[i];
                } else {
                    sourcesURLs.addElement(args[i]);
                }
                i++;
            }
            if (outputFile == null) {
                outputFile = "file:" + System.getProperty("user.dir") + File.separator + "merged.avi";
            }
        }

        public static void main(String[] args) {
            new Merge(args);
            System.exit(0);
        }
    }
    I can't find the problem. Can anyone help me?
    Thanks,
    Shin

    Even when I manually configure and realize the processors for the input files, my application gets blocked in the following snippet of code:
    while (!done) {
        if (outputProcessor != null
                && (int) (outputProcessor.getMediaTime().getSeconds()) > count) {
            count = (int) (outputProcessor.getMediaTime().getSeconds());
            logger.debug("Merging is in progress...");
        }
    }
    This happens only when I try to merge large video and audio files. Is this a limitation of JMF, or a mistake in my app?
    Please suggest any idea or link that might lead to a solution.
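One thing worth checking in that loop: `done` is set from JMF's event-dispatch thread, so if it is a plain field the polling thread may never see the update, and if no EndOfStreamEvent ever arrives the loop spins forever. A JMF-free sketch of a safer completion flag (volatile, wait/notify, and a timeout so a missing end-of-stream event cannot hang the app), assuming your listener callbacks look roughly like the ones in Merge.java:

```java
// Sketch: completion flag that is safe to set from a listener thread and
// wait on from the main thread, with a timeout so a missing end-of-stream
// event cannot hang the application forever.
public class DoneFlag {
    private volatile boolean done = false;

    // Called from the event/listener thread (e.g. dataSinkUpdate).
    public synchronized void signalDone() {
        done = true;
        notifyAll();
    }

    // Called from the merging thread instead of a while(!done) spin loop.
    // Returns true if completion was signalled, false on timeout.
    public synchronized boolean await(long timeoutMillis) throws InterruptedException {
        long deadline = System.currentTimeMillis() + timeoutMillis;
        while (!done) {
            long remaining = deadline - System.currentTimeMillis();
            if (remaining <= 0) return false;
            wait(remaining);
        }
        return true;
    }

    public static void main(String[] args) throws InterruptedException {
        DoneFlag flag = new DoneFlag();
        new Thread(() -> {
            try { Thread.sleep(100); } catch (InterruptedException ignored) {}
            flag.signalDone(); // stands in for dataSinkUpdate(EndOfStreamEvent)
        }).start();
        System.out.println("finished: " + flag.await(5000));
    }
}
```

On a timeout you can then log the processor's media time and state, which narrows down whether the sink stalled or the end-of-stream event was simply never delivered.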

  • OT: What viewers do you need to use ALL audio and video files?

    I had a lot of trouble with audio and video files on my PC. I now have a MacBook Pro, which is a big improvement. But I still frequently encounter files that I can't use. I try to open them, but Apple doesn't know what to do with them. For example, I often encounter unusable videos while perusing news websites. I also have some audio files with .mva or .mvw extensions that don't work.
    So what viewers/players do I need to download in order to be able to play every audio/video file that comes my way? iTunes handles .mp3 files, and I also have QuickTime installed. I just viewed another video after installing a Flash plugin. What else do I need to install?
    Thanks.
    www.geobop.org - Family Websites
    www.invisible-republic.org - Adult political websites (Mature adults only)

    VLC is a big favorite
    http://www.macupdate.com/app/mac/5758/vlc-media-player

  • Merge audio to video clips, end of clip loses audio

    This is driving me nuts. After merging audio and video clips, synced to slate using the in points, everything is in sync.
    The audio is longer than the video itself. But WAY before the scene ends, the audio mutes. The waveforms are visible but no audio plays. It drops out. It happens to every merged clip I create. I don't use the out points either.
    Anyone else have this issue?
    If I instead sync them in the timeline and create a clip from there, it makes a new sequence clip, losing the scene and take metadata. That's not good. Is there a workaround?

    It happens on every project and with different formats. I do use FCP Rescue, and that doesn't solve it. I did figure out a workaround which involves a few more steps and is more time-consuming. Maybe this can help find the problem. Here are the ridiculous workaround steps. We should send this to Apple.
    The "--" steps are the extra ones.
    To get a good merged subclip:
    Open the audio file and add the 'in point' at the slate.
    --Drag the audio clip to the timeline.
    --Drag the audio clip from the timeline to the browser window.
    Merge with video.
    Voila. The audio won't drop out. But the waveforms are completely off from the sound.
    The next step is to open the merged clips and Make Subclip so you don't have that extra media at the beginning and end. Then the waveforms seem normal.

  • Can I receive both audio and video in the same window?

    Hi,
    I'm using AVTransmit2 and AVReceive2 for transmitting and receiving, and all works well, but the problem is that AVReceive2 shows the audio and video in two different windows and uses separate players for audio and video. Can you tell me how I can merge audio and video at the receiving end? I'm already merging audio and video at the transmitting end. Can anyone tell me how to solve this problem?
    Thanks in advance.

    Hi,
    I have the same situation - to use one window to show both rtp audio and video stream, but use the ControlPanelComponent of audio player to control both players.
    I want to use the audio player as the master player so that the video player could be synchronized with it. So: audioPlayer.addController(videoPlayer).
    In the single window frame for both players, I want to show the VisualComponent (VC) of the video and the ControlPanelComponent (CC) of the audio. So I add both components to the window frame after getting them from each player. Because the streams arrive over RTP and do not become realized at the same time, I first add the VC and CC of whichever player realizes first (audio or video), and then swap them for the desired video VC and audio CC.
    Although I check that both players are realized and not started before I call addController(), the application gets stuck in this method. I call it in a try block, of course, but no exception is thrown.
    When I use a TimeBase to synchronize them instead, i.e. videoPlayer.setTimeBase(audioPlayer.getTimeBase()), everything works fine. But that way I cannot control both players from one ControlPanelComponent.
    Anybody with any idea how to solve this problem, please reply.
    By the way, in example code such as AVReceive2, where the audio and video streams are shown in two separate windows, the user can use the audio's ControlPanelComponent to control both players, even though there seems to be no code to support that. The video's ControlPanelComponent cannot control the audio player either way. Does anyone have an explanation for this?

  • Having trouble with audio and video syncing in iTunes

    Having trouble with audio and video syncing in iTunes. Does anyone know how to repair it?

    Here are two suggestions:
    If everything is pretty much in sync until 17 minutes in, do a re-sync every ten minutes. This should ensure that, even if the sync drifts by a few frames over 10 minutes, the video will not seem distinctly out of sync at any time.
    Another approach would be to do an overall re-sync in QuickTime Pro (which I believe is part of the FCP package).
    You would need to use QT Pro to trim the video file to remove any surplus seconds of material at the beginning and end. Then open the audio file in QT Pro and trim that if necessary. It may be slightly more complicated at the beginning because, if the audio does not come in immediately, you'll need to make sure that you have the right amount of silence at the start of the file to match the video before the audio begins. This should leave you with video and audio files of the same length.
    Then, in your QT drop-down menus, select and copy the audio track. Switching to the QT video file, in the drop-down menus, click Add Track.
    Hope one or other of these is of some help to you.

  • Problem with merging incoming audio and video sources

    I am trying to record incoming audio and video data, but I ran into a weird problem. The data sources merge fine, but during playback of the recorded file the video sometimes turns white while the audio continues. For example, if the audio duration is 10 minutes, the file length will be 10 minutes, but the video shows only about 2 minutes and then the rest of the file is white with sound in the background. I am not sure why this is happening. I also tried merging the files after recording audio and video individually, but still no luck. Does anyone know why this is happening? Thanks

    I am working on audio/video conferencing. Before sending audio/video data from my side to the other side, I set the SendStream's setSourceDescription with my own name for the CNAME. On the receiving side, I sometimes get the correct CNAME that I set in the SourceDescription, but sometimes it gives the default CNAME, which is username@computername, instead of the CNAME set to my name.
    Can you help me solve this problem?
    Thanks & Regards
    Amol Chandurkar

  • MERGING AND SYNCING more than one audio and video clip?

    Is there a way to select a bin of video clips and a bin of audio clips (DSLR with separate audio) and have Premiere analyze and sync them in batches like Avid? I have a mountain of audio and video clips, but with no rhyme or reason to the file names, no timecode, no slates, just waveforms. Without knowing which audio files go with which video files, it's impossible (or at least very time-consuming).
    Or do I have to use PluralEyes or another synchronizing program for that?
    Thanks.

    Do the video clips have audio? That is, are you seeking to match A/V clips with audio-only clips? If so, Premiere can do that. Select all the clips in both bins, right-click, select Create Multicam Source Sequence, and select Sync by Audio.
    Whenever Premiere finds that two or more clips match each other but no others in the batch, it creates a multicam source sequence of just those clips.
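The audio-sync features mentioned above (Premiere's Sync by Audio, PluralEyes) work by finding the offset at which the camera-mic track and the recorder track line up best. A toy sketch of that idea as brute-force cross-correlation on raw PCM samples (illustrative only; real tools use normalized correlation on envelopes and are far more robust):

```java
// Sketch: find the sample offset that best aligns clip B inside clip A by
// brute-force cross-correlation. Illustrative toy, not production code.
public class AudioAlign {
    // Returns the offset (in samples) of b within a that maximizes correlation.
    static int bestOffset(double[] a, double[] b) {
        int bestLag = 0;
        double bestScore = Double.NEGATIVE_INFINITY;
        for (int lag = 0; lag + b.length <= a.length; lag++) {
            double score = 0;
            for (int i = 0; i < b.length; i++) {
                score += a[lag + i] * b[i];
            }
            if (score > bestScore) {
                bestScore = score;
                bestLag = lag;
            }
        }
        return bestLag;
    }

    public static void main(String[] args) {
        double[] camera = {0, 0, 0, 1, -1, 2, 0, 0};  // camera-mic track
        double[] recorder = {1, -1, 2};               // separate recorder track
        System.out.println("offset = " + bestOffset(camera, recorder)); // offset = 3
    }
}
```

This is also why clips with no scratch audio on the camera cannot be batch-synced this way: there is nothing to correlate against.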

  • Audio and video out of sync on converted AVI file

    I am trying to convert a DVD to an AVI for playback on a device. When I import the DVD and publish it as an .AVI, the output audio and video are out of sync. After about an hour of playback, the audio lags the video by a full second. How can I correct this problem?

    Walter0325
    If you have Premiere Elements 12 on Windows 8 and you want to rip VOBs from the DVD-VIDEO to export subsequently to DV AVI, here are the details....
    1. Do you have DVD-VIDEO standard or widescreen? Set the Premiere Elements 12 project preset manually.
    File Menu/New/Project. Set the project preset to NTSC or PAL DV Standard or NTSC or PAL DV Widescreen depending on what you have.
    Before you exit the new project dialog, make sure that you have a check mark next to Force Selected Project Setting on this Project.
    2. In the Premiere Elements workspace, use Add Media/DVD camera or computer drive/Video Importer to get the VOBs (only the ones in the VTS_01_1.VOB series). Set the Save In: location to a folder. Check mark next to Insert in Timeline. Hit Get Media.
    3. If you have an orange line OVER the Timeline content, press the Enter key to get the best possible preview of what you have on that Timeline.
    If after all that, the audio is out of sync, then use the Command Prompt method to get the individual VOBs into a single file with all the VOBs seamlessly merged. This worked nicely for a recent user here. See details in my blog post how to for the Command Prompt way.
    http://www.atr935.blogspot.com/2013/09/pe-dvd-videoseamless-vob-ripping.html
    Please do not hesitate to ask if you need clarification on anything that I have written.
    Depending on your results, more information may be requested about the specific properties of what you have on that DVD disc.
    Thank you.
    ATR

  • How to combine discrete audio and video files?

    I am working on a project which requires me to merge a video file with a PCM audio file from another source.
    I just bought Compressor 4.1.1, expecting to be able to do exactly this.
    The only "multiple input - single output" option I can see is creating a surround sound group, adding my audio as left and right channels, and then attaching the video.
    This does not work. Pressing "add" at the end of the process does not result in a new job.
    Help please!
    G.

    QT7 Pro is able to do this. Cost is $30.
    In QT7 Pro, open up the audio and video files to be matched.
    Select all the audio, then select Copy.
    In the video window, put the playhead where you want the audio to start, then select Add to Movie.
    Verify the playback is as desired, then save the movie. Done.
    Takes longer to describe the process than to do it.
    x
    Of course, life is good when the in point is exactly the same for both audio and video ...

  • Audio and Video on the same page.

    Is it OK to have a separate audio and a separate video file on the same page? Both are set to auto-play with no controls, but when I try this setup the audio seems to override the video, stopping the video a few seconds later.
    I guess I could try to embed the audio into the video, but I'd rather not if I don't have to.
    Any suggestions or help is appreciated.
    Thanks,
    Ryan.

    You can't play both the audio and the video simultaneously, you'll need to merge them.
    Neil

  • Audio and video don't open together in viewer when double clicked

    Hello everyone!
    I'm having an annoying problem... it isn't terribly important, but it has slowed down my workflow. Recently, I've had some trouble opening clips from the timeline in the viewer with a double click.
    If I double-click the video portion of a clip, even if both the video and the two audio tracks are highlighted (and still linked), only the video portion of the clip appears in the viewer (along with the "Filters" and "Motion" tabs). Similarly, when I double-click the audio tracks, though both video and audio are highlighted, only the audio tracks and the "Filters" tab show up in the viewer. Additionally, once the item is double-clicked, in the jigsaw pieces at the left that set video and audio destinations, if I clicked an audio portion (so only audio shows in the viewer), only the jigsaw pieces for the two audio tracks (a1, a2) are visible and the video track (v1) is no longer present, and vice versa.
    I'd appreciate any advice that you all have! It's really driving me batty, in addition to not knowing what I did to make it that way, I can't figure out how to change it back!
    Heather

    Thanks for the advice.
    The clips already appeared to be linked and the "link" selection was checked, but when I unlinked and then relinked them, I found that the audio and video would open together in the viewer... problem apparently solved! However, for the clips I had been working with, I had nudged the audio and video a few frames apart (out of sync) to get them to line up (which gives the little red box with a +/- number on the clip).
    When I unlinked and relinked, the clips lost those red boxes and went back to being synced (as far as FCP is concerned) with the audio and video unaligned.
    So the problem appears to come from moving the audio and video of a clip out of sync... why is this? Is this something I'm just going to have to deal with?
    Thanks
    H

  • Audio and video sync problem

    Hi,
    I have a problem with audio and video synchronization. I'm using the Merge class to merge video and audio tracks, but the audio track can be shorter than the whole video track (for example, video length: 2 min., audio length: 1 min.). When I merge them, everything is OK. After that I'd like to concatenate several tracks (generated by Merge) into one final video file. For this purpose I'm using the Concat class.
    But here is the problem: after I concatenate all these tracks, the resulting video is OK, but each audio track starts right after the previous audio track ended. (If I have video1 - 2 min., audio1 - 1 min., video2 - 2 min., audio2 - 30 seconds, then audio2 starts at 1:30, right after the first audio track. But I want it to start when video2 starts.) Do you have any ideas?
    Merge class http://java.sun.com/javase/technologies/desktop/media/jmf/2.1.1/solutions/Merge.html
    Concat class http://java.sun.com/javase/technologies/desktop/media/jmf/2.1.1/solutions/Concat.html
    Thanks for help in advance.
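One way to get the behavior the poster wants is to pad each audio track with silence up to its video's length before running Concat, so that audio2 can only start where video2 starts. A JMF-free sketch of the padding arithmetic for 8-bit unsigned LINEAR PCM (the sample rate and format here are illustrative assumptions):

```java
// Sketch: pad a PCM audio buffer with silence so its duration matches the
// video it belongs to, before concatenation. Parameters are illustrative.
public class PadToVideo {
    // 8-bit unsigned mono PCM: silence is the midpoint value 128.
    static byte[] padWithSilence(byte[] audio, double videoSeconds, int sampleRate) {
        int targetSamples = (int) Math.round(videoSeconds * sampleRate);
        if (targetSamples <= audio.length) return audio;
        byte[] padded = new byte[targetSamples];
        System.arraycopy(audio, 0, padded, 0, audio.length);
        java.util.Arrays.fill(padded, audio.length, targetSamples, (byte) 128);
        return padded;
    }

    public static void main(String[] args) {
        // 1 min of audio padded to 2 min of video at 8000 Hz.
        byte[] oneMinute = new byte[60 * 8000];
        byte[] padded = padWithSilence(oneMinute, 120.0, 8000);
        System.out.println(padded.length); // 960000
    }
}
```

The reply below takes the equivalent JMF route: generating a silent track of a chosen duration as its own DataSource and merging it in.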

    Ok thanks.
    I tried to implement PullBufferDataSource and everything went OK; the audio .wav file was generated, but when I play that file, its length is 0 and not the length I wanted (40.00). Where could the problem be?
    Here is the code I wrote/changed:
    TimeDataSource tds = new TimeDataSource(40.00);
    Processor p;
    try {
        System.err.println("- create processor for the image datasource ...");
        p = Manager.createProcessor(tds);
    } catch (Exception e) {
        System.err.println("Yikes!  Cannot create a processor from the data source.");
        return false;
    }
    I set the content type to WAVE:
    // Set the output content descriptor to WAVE.
    p.setContentDescriptor(new ContentDescriptor(FileTypeDescriptor.WAVE));
    I use the following DataSource:
    class TimeDataSource extends PullBufferDataSource {
        private double duration;
        TimeSourceStream streams[];

        public TimeDataSource(double duration) {
            this.duration = duration;
            streams = new TimeSourceStream[1];
            streams[0] = new TimeSourceStream(40);
        }

        @Override
        public PullBufferStream[] getStreams() {
            return streams;
        }

        @Override
        public void connect() throws IOException {
        }

        @Override
        public void disconnect() {
        }

        @Override
        public String getContentType() {
            return ContentDescriptor.RAW;
        }

        @Override
        public Object getControl(String arg0) {
            return null;
        }

        @Override
        public Object[] getControls() {
            return new Object[0];
        }

        @Override
        public Time getDuration() {
            return new Time(duration);
        }

        @Override
        public void start() throws IOException {
        }

        @Override
        public void stop() throws IOException {
        }
    }

    class TimeSourceStream implements PullBufferStream {
        private AudioFormat format;
        private boolean ended = false;
        private long duration;

        public TimeSourceStream(long duration) {
            this.duration = duration;
            format = new AudioFormat(AudioFormat.LINEAR, 8000, 8, 1);
        }

        @Override
        public Format getFormat() {
            return format;
        }

        @Override
        public void read(Buffer buf) throws IOException {
            // Note: this hands the Processor an empty buffer and signals
            // end-of-media on the very first call, so no sample data is written.
            buf.setDuration(duration);
            buf.setFormat(format);
            buf.setEOM(true);
            buf.setOffset(0);
            buf.setLength(0);
            ended = true;
        }

        @Override
        public boolean willReadBlock() {
            return false;
        }

        @Override
        public boolean endOfStream() {
            return ended;
        }

        @Override
        public ContentDescriptor getContentDescriptor() {
            return new ContentDescriptor(ContentDescriptor.RAW);
        }

        @Override
        public long getContentLength() {
            return 0;
        }

        @Override
        public Object getControl(String arg0) {
            return null;
        }

        @Override
        public Object[] getControls() {
            return new Object[0];
        }
    }
    Edited by: mako123 on Feb 22, 2010 2:18 PM
    Edited by: mako123 on Feb 22, 2010 2:23 PM
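A likely cause of the zero-length file: read() signals end-of-media on its very first call while handing over a buffer of length 0, so the Processor writes a WAVE header with no sample data; setDuration() on the Buffer does not conjure samples. The stream needs to deliver duration × sampleRate × bytesPerSample × channels bytes of actual data before setting EOM. A small sketch of that byte accounting (format figures taken from the AudioFormat in the posted code):

```java
// Sketch: the byte accounting the PullBufferStream above is missing.
// For 8 kHz, 8-bit, mono LINEAR audio, N seconds of audio needs
// N * 8000 * 1 * 1 bytes of sample data, delivered before EOM is set.
public class SilenceBytes {
    static long bytesNeeded(double seconds, int sampleRate, int bitsPerSample, int channels) {
        return Math.round(seconds * sampleRate) * (bitsPerSample / 8L) * channels;
    }

    public static void main(String[] args) {
        System.out.println(bytesNeeded(40.0, 8000, 8, 1)); // 320000
    }
}
```

So read() should fill buf with up to one chunk of those bytes per call, track how many remain, and only set EOM (and length 0) once the remaining count reaches zero.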
