RTP Transmission Problem

Dear friends,
I am working with JMF to transmit a video (.avi) file from one computer to another over a TCP/IP network; the computers are connected in a peer-to-peer setup only. When I run the transmission server code it works and transmits (I verified it on the same computer), but on the remote computer (connected to the same network) the video either arrives after a long delay, and even then not completely, or the receiver just waits for data, so there is no transmission at all. I have to show a demo to my staff, so please help me. If you have any questions or doubts regarding this problem, please contact me at [email protected] or post your replies here. I am waiting...
With regards,
Muthukumar

"when i run the transmission server code it works and transmits ( i verified it in same computer )" - I don't understand; how do you know it is transmitting over the network when both ends are on the same computer?
Are you using JMStudio?
Transmit the file to the IP address and port of the destination computer, and have the destination computer listen on its own IP address and port.
Alternatively, transmit to xxx.xxx.xxx.255 portNum
and receive on xxx.xxx.xxx.255 portNum (the broadcast address).
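
For reference, a minimal sketch of the usual JMF pattern for transmitting a file over RTP with a DataSink and an rtp:// locator, in the spirit of the transmit samples. The file path, destination address and port below are placeholders, and the getState() polling is just a shortcut for a proper ControllerListener:

import javax.media.*;
import javax.media.protocol.*;

public class FileRtpTransmit {
    public static void main(String[] args) throws Exception {
        // Placeholder source file and destination; replace with real values.
        MediaLocator src = new MediaLocator("file:/C:/clips/clip01.avi");
        Processor p = Manager.createProcessor(src);
        p.configure();
        while (p.getState() < Processor.Configured) Thread.sleep(50);
        // Ask the processor to produce raw RTP output.
        p.setContentDescriptor(new ContentDescriptor(ContentDescriptor.RAW_RTP));
        p.realize();
        while (p.getState() < Processor.Realized) Thread.sleep(50);
        // Point a DataSink at the receiver's address and (even) RTP port.
        MediaLocator dest = new MediaLocator("rtp://192.168.1.5:42050/video");
        DataSink sink = Manager.createDataSink(p.getDataOutput(), dest);
        sink.open();
        sink.start();
        p.start();
    }
}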

Similar Messages

  • H263 RTP and JPEG RTP buffering problem

    Hi
    I have a problem with buffering a received video stream. I use the BufferControl of the RTPManager; for the audio stream everything works fine, but for video the BufferControl is null, so I can't set it (the same problem occurs with the Player or Processor). The audio stream has all its parameters (bitrate, sample rate, etc.) but the video stream has only the encoding and size parameters.
    I tried to change the format (via a Processor) of the received stream to a supported format (for example JPEG), set all the parameters (size, framerate, maxDataLength and encoding), and use the output DataSource, but the Player still can't get a BufferControl. When I use the Processor as a player I can see in the media properties that it is set the way I want, but the BufferControl is null.
    I think the H263_RTP and JPEG_RTP video formats don't support a framerate (I tried to set it but nothing happens) and the stream is played as it comes in.
    I need to set a buffer because the framerate of the incoming stream is really low (it's a p2p application) and the smoothness of the video is bad.
    Please help me find any suggestion to solve this problem. Maybe I have to use a custom transport layer or something like that.

    cziz13 wrote:
    "I don't know what you want to say."
    I said exactly what I wanted to say.
    "I got a BufferControl on a realized Player (it handles the rendering, and it's in the AudioBufferControl.java solution). I got the buffer even on a video file, but only when I play it from my hard drive. When I try to do it on the received video stream the buffer is not working."
    Good for you, but that wasn't my point.
    Whenever you request a "Player" object, you don't get a plain "Player" object; you get some specific Player subclass that, depending on what it's designed to do, will implement certain interfaces.
    The BufferControl object you're getting on the Player that plays from the hard drive is more of a prefetch cache than a buffer (in the sense that I think of a buffer, at least). It's using a pull DataSource, so it can control getting data from its DataSource.
    However, the Player that's playing an RTP stream relies on a push buffer DataSource. It has absolutely no control over the data; it's told by the DataSource when data is ready to be played, so it just loads when it's told to by the DataSource.
    And the DataSource is just handed data from the RTP streams inside the RTPManager, based on the internal RTP buffering that is handled by the RTPManager...
    "Any more suggestions?"
    My original "suggestion" still stands. Get your BufferControl object on the RTPManager that you're using to receive the stream...
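
    As a rough illustration of that suggestion, a minimal sketch (assuming an AVReceive2-style setup) of asking the receiving RTPManager for its BufferControl before any streams arrive; the buffer length and threshold values are arbitrary examples:

    import java.net.InetAddress;
    import javax.media.control.BufferControl;
    import javax.media.rtp.RTPManager;
    import javax.media.rtp.SessionAddress;

    public class ReceiverBufferSetup {
        // Returns an RTPManager whose jitter buffer is enlarged before any
        // streams arrive; localPort / senderHost / senderPort are placeholders.
        public static RTPManager openSession(int localPort, String senderHost, int senderPort) throws Exception {
            RTPManager mgr = RTPManager.newInstance();
            SessionAddress localAddr = new SessionAddress(InetAddress.getLocalHost(), localPort);
            mgr.initialize(localAddr);
            // The BufferControl lives on the RTPManager, not on the Player built
            // from the received stream.
            BufferControl bc = (BufferControl) mgr.getControl("javax.media.control.BufferControl");
            if (bc != null) {
                bc.setBufferLength(2000);      // ~2 s of media (milliseconds); example value
                bc.setMinimumThreshold(1000);  // don't start draining until ~1 s is buffered
            }
            SessionAddress remoteAddr = new SessionAddress(InetAddress.getByName(senderHost), senderPort);
            mgr.addTarget(remoteAddr);
            return mgr;
        }
    }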

  • RTP transmission problems!!!

    I wrote an application that sends a video stream to another PC. The RTP transmission code is like the AVTransmit sample and it works well, but when I want to close or shut down my RTP transmission there is a big problem: how can I close it completely without quitting the application? The close method is like this:
    public void stopTrans() {
        //synchronized (this)
        if (processor != null) {
            processor.stop();
            processor.close();
            processor = null;
        }
        for (int i = 0; i < rtpMgrs.length; i++) {
            rtpMgrs[i].removeTargets("Session ended.");
            rtpMgrs[i].dispose();
            rtpMgrs[i] = null;
        }
        rtpMgrs = null;
    }
    When I press the send button to start a new transmission again, there is an exception that says:
    Can not create RTP session: javax.media.rtp.InvalidSessionAddressException: Can't open local data port: 3000
    Port 3000 is the port number I used last time, so if I try to create the transmission again using the same port, it fails. Why? How can I release the port from an RTP transmission? Please help me. Thank you very much.

    I have exactly the same problem as Rick.T, that is, I can't use the same port a second time... I get the message "can't open local data port xxxxx".
    For every unicast session I use 4 ports per client: 2 to transmit (video and audio) and 2 to receive (video and audio). So it doesn't make sense to use new ports every time...
    I have tried to close/stop/deallocate the Processor, to stop/disconnect the DataSource, to stop/close the SendStream, and of course to removeTargets/dispose the RTPManager... but I still face the same problem.
    It's the very final stage of my project and I don't know what else to try...
    I have searched for other similar topics here but there is no answer to this problem.
    Is there anyone who has the solution to release the ports?
    Thanks in advance
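
    For what it's worth, a minimal teardown sketch in the spirit of the AVTransmit2 sample: stop and close the send streams first, then remove targets and dispose of the RTP managers, and close the processor last. The field names (processor, rtpMgrs, sendStreams) are placeholders for whatever the surrounding class actually uses, and whether this frees the local data port in every JMF version is not guaranteed.

    import javax.media.Processor;
    import javax.media.rtp.RTPManager;
    import javax.media.rtp.SendStream;

    class TransmissionTeardown {
        // Teardown order that releases the send streams before the managers.
        static void stopTransmission(Processor processor, RTPManager[] rtpMgrs, SendStream[] sendStreams) {
            // 1. Stop and close every send stream so the managers stop using the sockets.
            for (SendStream ss : sendStreams) {
                if (ss != null) {
                    try {
                        ss.stop();
                        ss.close();
                    } catch (Exception e) {
                        e.printStackTrace();
                    }
                }
            }
            // 2. Tear down each RTP session: drop the targets, then dispose (closes the local ports).
            for (RTPManager mgr : rtpMgrs) {
                if (mgr != null) {
                    mgr.removeTargets("Session ended.");
                    mgr.dispose();
                }
            }
            // 3. Finally stop, deallocate and close the processor feeding the streams.
            if (processor != null) {
                processor.stop();
                processor.deallocate();
                processor.close();
            }
        }
    }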

  • G723_RTP decoding from rtp stream problem

    Hi,
    I've managed to establish an RTP connection and send the audio stream in G.723 format. The stream is OK, but I can't decode the response from the Asterisk server. I use the same approach to set the audio format for both sending and receiving:
    TrackControl[] tcs = processor.getTrackControls();
    if (tcs.length == 0) {
        return; // should not enter this case
    }
    TrackControl tc = tcs[0]; // we expect one channel only
    String codecType = SoftPhoneSettings.audio_codec;
    tc.setFormat(new AudioFormat(AudioFormat.G723_RTP, 8000.0, 16, 1, -1, -1, 192, -1.0, null));
    tc.setEnabled(true);
    If I use the GSM_RTP or ULAW_RTP codec it works just fine on both sides (it is also decoded correctly on my end), but with G723_RTP I hear no voice. I also get an EXCEPTION_ACCESS_VIOLATION on processor.close(), and that happens only when using this codec.
    The Asterisk server is working fine, but I can't figure out what exactly the problem is here, and why I can send correctly but can't decode the response. I've also tried setCodecChain, but the situation is the same. Can anybody help me with this? I can't find the source of the problem and, more importantly, I still can't solve it. Thanks in advance.

    Is there any standard way of retrieving the G723_RTP stream, for example using a Player?
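
    A minimal sketch of the standard receive pattern (as in the AVReceive2 sample): listen for NewReceiveStreamEvent on the RTPManager and create a Player from the stream's DataSource. Whether JMF actually ships a working G.723 decoder for your platform is a separate question; this only shows the generic mechanism.

    import javax.media.Manager;
    import javax.media.Player;
    import javax.media.protocol.DataSource;
    import javax.media.rtp.ReceiveStream;
    import javax.media.rtp.ReceiveStreamListener;
    import javax.media.rtp.event.NewReceiveStreamEvent;
    import javax.media.rtp.event.ReceiveStreamEvent;

    class StreamPlayerListener implements ReceiveStreamListener {
        public void update(ReceiveStreamEvent evt) {
            if (evt instanceof NewReceiveStreamEvent) {
                try {
                    ReceiveStream stream = ((NewReceiveStreamEvent) evt).getReceiveStream();
                    DataSource ds = stream.getDataSource();
                    // Let JMF pick a handler (and decoder chain) for the incoming format.
                    Player player = Manager.createPlayer(ds);
                    player.start(); // realizes asynchronously, then plays
                } catch (Exception e) {
                    e.printStackTrace();
                }
            }
        }
    }
    // Register with: rtpManager.addReceiveStreamListener(new StreamPlayerListener());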

  • Video RTP transmission problem

    Hey guys (and girls),
    I'm writing two programs that respectively send and receive an RTP video stream captured with a webcam. Using JMStudio I can capture and transmit the video and happily receive it with my client program, but when I try to capture and transmit the video using my server program the client can't realize because it hasn't received any data.
    Here is my transmitter code; my server class creates a new instance of the camTransmitter class.
    import java.io.*;
    import java.awt.*;
    import java.util.Vector;
    import javax.swing.*;
    import java.awt.event.*;
    import javax.media.*;
    import javax.media.format.*;
    import javax.media.protocol.*;
    import javax.media.control.*;
    import javax.media.ControllerListener;
    import javax.media.rtp.*;
    import javax.media.rtp.rtcp.*;
    import javax.media.rtp.event.*;
    import javax.media.rtp.SendStreamListener.*;
    import java.net.InetAddress;
    public class camTransmitter implements ControllerListener {
        Processor p = null;
        CaptureDeviceInfo di = null;
        boolean configured = false, realized = false;
        public String ipAddress = "192.168.123.150";
        public int port = 22222;
        public RTPManager rtpMgrs[];
        public DataSource vDS;
        public SessionAddress localAddr, destAddr;
        public InetAddress ipAddr;
        public SendStream sendStream;

        public camTransmitter() {
            // First, we'll need a DataSource that captures live video.
            Format vFormat = new VideoFormat(VideoFormat.RGB);
            Vector devices = CaptureDeviceManager.getDeviceList(vFormat);
            if (devices.size() > 0) {
                di = (CaptureDeviceInfo) devices.elementAt(0);
            } else {
                // Exit if we could not find the relevant capture device.
                System.out.println("no devices");
                System.exit(-1);
            }
            MediaLocator camLocation = di.getLocator();
            // Create a processor for this capture device & exit if we cannot create it.
            try {
                vDS = Manager.createDataSource(camLocation);
                p = Manager.createProcessor(vDS);
                p.addControllerListener(this);
            } catch (IOException e) {
                System.exit(-1);
            } catch (NoProcessorException e) {
                System.exit(-1);
            } catch (NoDataSourceException e) {
                System.exit(-1);
            }
            // At this point, we have successfully created the processor.
            // Configure it and block until it is configured.
            p.configure();
            while (!configured) { /* busy-wait until controllerUpdate sets the flag */ }
            p.setContentDescriptor(new ContentDescriptor(ContentDescriptor.RAW_RTP));
            TrackControl track[] = p.getTrackControls();
            boolean encodingOk = false;
            // Go through the tracks and try to program one of them to output the desired video format.
            for (int i = 0; i < track.length; i++) {
                if (!encodingOk && track[i] instanceof FormatControl) {
                    if (((FormatControl) track[i]).setFormat(vFormat) == null) {
                        track[i].setEnabled(false);
                    } else {
                        encodingOk = true;
                    }
                } else {
                    // We could not set this track to the desired format, so disable it.
                    track[i].setEnabled(false);
                }
            }
            // Realize it and block until it is realized.
            p.realize();
            while (!realized) { /* busy-wait until controllerUpdate sets the flag */ }
            // Get the output DataSource of the processor and exit if we fail.
            DataSource ds = null;
            try {
                ds = p.getDataOutput();
            } catch (NotRealizedError e) {
                System.exit(-1);
            }
            PushBufferDataSource pbds = (PushBufferDataSource) ds;
            PushBufferStream pbss[] = pbds.getStreams();
            rtpMgrs = new RTPManager[pbss.length];
            for (int i = 0; i < pbss.length; i++) {
                try {
                    rtpMgrs[i] = RTPManager.newInstance();
                    ipAddr = InetAddress.getByName(ipAddress);
                    localAddr = new SessionAddress(InetAddress.getLocalHost(), port);
                    destAddr = new SessionAddress(ipAddr, port);
                    System.out.println(localAddr);
                    rtpMgrs[i].initialize(destAddr);
                    rtpMgrs[i].addTarget(destAddr);
                    System.out.println("Created RTP session: " + ipAddress + " " + port);
                    sendStream = rtpMgrs[i].createSendStream(ds, i);
                    sendStream.start();
                } catch (Exception e) {
                    System.err.println(e.getMessage());
                }
            }
            //System.out.println("RTP Stream created and running");
        }

        public void controllerUpdate(ControllerEvent e) {
            if (e instanceof ConfigureCompleteEvent) {
                System.out.println("Configure Complete");
                configured = true;
            }
            if (e instanceof RealizeCompleteEvent) {
                System.out.println("Realizing complete");
                realized = true;
            }
        }
    }
    When I run this the processor realizes and the RTP session seems to stream, but I have no idea why I can't receive the video from it.
    Please help if anyone can. Thanks

    Dear andreyvk,
    I've read your post
    http://forum.java.sun.com/thread.jspa?threadID=785134&tstart=165
    about how to use a single RTP session for both media reception and transmission (I'm referring to your modified RTPSocketAdapter version), but at the moment I get a BIND error.
    I think your post is an EXCELLENT solution. I've modified AVReceive3 and AVTransmit3 so that they accept all parameters (local IP & port, remote IP & port).
    Can you please give me a simple scenario so I can understand what the mistake is?
    I run AVTransmit3 and AVReceive3 from different prompts. If I run these two classes at the same time on two different PCs (172.17.32.27 and 172.17.32.30) I can transmit the media (vfw://0 for example) using AVTransmit3, but I can't receive anything if I also run AVReceive3 on the same PC.
    What's the problem? Furthermore, if I run AVReceive3 first from an MS-DOS prompt and subsequently run AVTransmit3 from another prompt, I get a BIND error (port already in use).
    How can I use your modified RTPSocketAdapter in order to send and receive a single media stream on the same port (e.g. 7500)?
    I've used this scenario: PC1: IP 172.17.32.30, local port 5000, and PC2: IP 172.17.32.27, local port 10000.
    So on PC1 I run:
    AVTransmit3 vfw://0 <Local IP 172.17.32.30> <5000> <Remote IP 172.17.32.27> <10000>
    AVReceive3 <Local IP 172.17.32.30/5000> <Remote IP 172.17.32.27/10000>
    and on PC2:
    AVTransmit3 vfw://0 <Local IP 172.17.32.27> <10000> <Remote IP 172.17.32.30> <5000>
    AVReceive3 <Local IP 172.17.32.27/10000> <Remote IP 172.17.32.30/5000>
    I'd like to use the same port, 5000 (on PC1) and 10000 (on PC2), to both transmit and receive RTP packets. How can I do that without getting a bind error? How can I receive packets (and play the media if audio and/or video) on the same port used to send the stream over the network?
    How can I obtain an RTP symmetric transmission/reception solution?
    Please give me a hint. If you can't post, this is my email: [email protected]
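
    Not andreyvk's RTPSocketAdapter itself, but as a rough sketch of one way to get symmetric send/receive without a bind error: use a single RTPManager per peer, initialize it once on the local port, and use that same instance both for the send stream and for receiving. The class, method and parameter names below are placeholders.

    import java.net.InetAddress;
    import javax.media.Processor;
    import javax.media.rtp.RTPManager;
    import javax.media.rtp.ReceiveStreamListener;
    import javax.media.rtp.SendStream;
    import javax.media.rtp.SessionAddress;

    class SymmetricRtpSession {
        // One manager bound once to localPort handles both directions,
        // so the port is never opened twice.
        static RTPManager open(int localPort, String remoteHost, int remotePort,
                               Processor processor, ReceiveStreamListener listener) throws Exception {
            RTPManager mgr = RTPManager.newInstance();
            mgr.addReceiveStreamListener(listener);          // incoming media arrives here
            SessionAddress local = new SessionAddress(InetAddress.getLocalHost(), localPort);
            SessionAddress remote = new SessionAddress(InetAddress.getByName(remoteHost), remotePort);
            mgr.initialize(local);                           // binds the local data/control ports once
            mgr.addTarget(remote);
            SendStream out = mgr.createSendStream(processor.getDataOutput(), 0);
            out.start();                                     // outgoing media leaves from the same port
            return mgr;
        }
    }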

  • RTP session problem

    Hi, I'm using a custom DataSource for JMF to send audio over an RTPManager, using JavaSound to capture live audio from a microphone, while at the same time receiving audio from an RTP session with JMF. The code for the custom DataSource is one posted here a while ago and it works fine. The problem is that when I start sending audio the incoming audio stops, although my sound is sent correctly to the other party. So if I don't send audio I hear the audio from the RTP session; if I send audio I don't hear the incoming audio. I'm only talking about one side of the call, because the other party sends its RTP stream via a SIP server connected to a regular phone line, and I don't know exactly how that works; I only know that it works. Does anyone know what can be wrong here? Could it be a buffer problem, a blocked device, or just an error in the code? Thanks in advance.

    Your question doesn't make a whole lot of sense, but I'd assume you've coded something incorrectly.

  • RTP transmitting problem

    I am facing a problem...
    I read in the JMF API that there are three ways to transmit voice over the network:
    1. Use a MediaLocator that has the parameters of the RTP session to construct an RTP DataSink by calling Manager.createDataSink.
    2. Use a session manager to create send streams for the content and control the transmission.
    3. Through sockets.
    I got the code for the first method (the DataSink approach), but I don't know how to build the string, that is, the URL:
    // Hand this DataSource to the Manager to create an RTP
    // DataSink; our RTP DataSink will multicast the audio.
    try {
        String url = "rtp://224.144.251.104:49150/audio/1";
        MediaLocator m = new MediaLocator(url);
        DataSink d = Manager.createDataSink(ds, m);
        d.open();
        d.start();
    } catch (Exception e) {
        System.exit(-1);
    }
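
    For reference, the rtp:// locator in that snippet follows the pattern rtp://address:port/contentType/ttl. A small sketch of building it from variables (the host, port and content type here are placeholders):

    // Build an RTP locator string of the form rtp://address:port/contentType/ttl.
    String host = "224.144.251.104"; // unicast or multicast destination
    int rtpPort = 49150;             // conventionally an even port; RTCP uses rtpPort + 1
    String contentType = "audio";    // "audio" or "video"
    int ttl = 1;                     // time-to-live, only meaningful for multicast
    String url = "rtp://" + host + ":" + rtpPort + "/" + contentType + "/" + ttl;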

    I've faced a similar problem; what I did was to use TWO session managers, one for the transmitter and the other for the receiver. I then used these objects in a container.
    Both peers use the same port and know each other's address.
    For receiving data from the peer I do:
    RTPManager mymgr = RTPManager.newInstance();
    // add as listener to create players for each incoming stream
    mymgr.addReceiveStreamListener(this);
    SessionAddress localaddr = new SessionAddress(InetAddress.getLocalHost(), port);
    SessionAddress sessaddr = new SessionAddress(peerAddress, port);
    mymgr.initialize(localaddr);
    mymgr.addTarget(sessaddr);
    For transmitting data to the peer I do:
    RTPManager mymgr = RTPManager.newInstance();
    // add as listener to create players for each incoming stream
    mymgr.addReceiveStreamListener(this);
    SessionAddress localaddr = new SessionAddress(InetAddress.getLocalHost(), SessionAddress.ANY_PORT);
    SessionAddress sessaddr = new SessionAddress(peerAddress, port);
    mymgr.initialize(localaddr);
    mymgr.addTarget(sessaddr);
    // the processor is capturing audio from the mic
    processor.start();
    SendStream sendStream = mymgr.createSendStream(processor.getDataOutput(), i);
    sendStream.start();
    I probably forgot something while copy-pasting from my code... but I think this might help you.
    Good luck
    Marco

  • RTP transmission problem.

    Hi,
    I developed an audio/video conferencing application which transmits the RTP data from a server to all clients. The problem I'm facing is that the system works perfectly on my home network, but when I try it on my university network it doesn't work.
    I use the IP address 192.168.0.255 for the transmitting server, and the clients use the IP of the server to access the data. In the university environment, I change the IP address accordingly.
    What could be the cause of this problem? Is it perhaps because of my university's network settings?
    Please help me... thanks

    I've understood my problem... My error was not closing the Player objects created by the Receiver...
    I thought that by closing the RTPManager I would also close the associated Player objects, but this is not true...
    Each Player object needs to be closed...

  • Re: RTP transmission problem

    Is it necessary to use the AVTransmit3 and RTPSocketAdapter classes for your project?
    If not, for your requirement you can use AVTransmit2 and add a remote SessionAddress whenever you need one.
    If you want to use AVTransmit3 and RTPSocketAdapter, keep createTransmitter in a separate class and create an object whenever you want (say, whenever a new avatar is met); here you have to be careful to use the local DataSource...
    Karthikeyan R


  • RTP synchronization problem

    I tried to send a movie from a server to a client. I used RTPManager to receive the audio and video streams in a client applet. They work fine and the client can receive both streams. However, the two streams are not synchronized; the time gap between them is around 2 seconds. How can I synchronize the audio and video streams?
    Thanks a lot.
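
    One approach I believe works in JMF is to chain the two realized Players with addController, so a single clock drives both; a minimal sketch (the two Player parameters stand for the audio and video Players created from the received streams):

    import javax.media.IncompatibleTimeBaseException;
    import javax.media.Player;

    class AvSync {
        // Drive the audio and video Players from a single clock so they start
        // and progress together. Both players must already be realized.
        static void synchronize(Player videoPlayer, Player audioPlayer) {
            try {
                // videoPlayer becomes the master; it now also controls audioPlayer.
                videoPlayer.addController(audioPlayer);
                videoPlayer.start();
            } catch (IncompatibleTimeBaseException e) {
                // Fall back to starting them separately if the time bases can't be shared.
                videoPlayer.start();
                audioPlayer.start();
            }
        }
    }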

    Hi, I'm doing a project nearly the same as the ones you all mentioned, but I'm facing a problem: my client side, which uses an applet, cannot find its own IP; the IP it gets is localhost/127.0.0.1, and because of this I cannot even load the player to connect to the server. Since you can already stream the video and audio out, I'm wondering how you get the IP of the client side.
    P.S. The applet is served from the IIS on the server.

  • Problem with BufferControl of RTPManager

    Hi
    When I receive a stream in an RTP session I set the buffer length on the BufferControl.
    It works well, but when I play a DataSource from this stream, at the beginning it plays too fast (it plays everything it has in the buffer) and then slows down (and plays at the rate the stream comes in).
    I want to set a big buffer and play at the same framerate the whole time (to play from the buffer without speeding up). I used the threshold but the effect is the same.
    The problem is that (in a p2p application) when I want to retransmit the received stream from B to C (I clone the received DataSource) the frame rate is lower than that of the original DataSource. So I want to store more in the buffer to play at the same rate. How can I prevent the player from speeding up?
    Could anyone help me?

    cziz13 wrote:
    "Any suggestion where I can find any solution?"
    My "guess" as to why there's the RTP acceleration problem is...
    In "real time", if you drop a frame, you have to wait for the next one to arrive. So there is an inherent delay where the dropped frame was supposed to go.
    In "buffer time", if you're missing a frame, playback goes immediately to the next frame without a delay where the dropped frame would have gone. If you drop a reasonable number of frames (say you drop 5 frames out of every 30) then you essentially have a 15% speedup...
    So there is no solution based on the framerate, because the player is playing at the correct framerate... it's just not waiting during the dropped frames because, how the hell would it know to do that?
    There are 2 "solutions" to this problem.
    The first:
    [http://java.sun.com/javase/technologies/desktop/media/jmf/2.1.1/solutions/CustomPayload.html]
    Write a custom depacketizer for a video format that would, essentially, duplicate the last received frame whenever a skip in frame number is detected. So if frames 15 and 16 were dropped, you'd put 3 copies of frame 14 into the video. This would ensure that the video plays back at the correct speed.
    The second:
    [http://java.sun.com/javase/technologies/desktop/media/jmf/2.1.1/solutions/RTPConnector.html]
    Swap out the UDP socket in that example for a TCP socket. This would ensure that no packets are dropped, and thus no frames are dropped, and therefore your video won't be missing any data.

  • No voice during incoming/outgoing calls

    Hi, has anyone faced this issue before? There is no voice heard during incoming or outgoing calls. However, we are able to talk on Skype or listen to music.

    To me this sounds like an RTP stream problem; it probably does not pass the firewall / NAT on the way between the SIP provider and you.
    What may have happened is that either on your Belkin router/NAT, or on your internet ISP's router/NAT, the SIP ALG (Application Layer Gateway) was enabled.
    If this is the case, you need to disable SIP ALG on your router and/or ask your internet ISP to disable it on the firewall router.
    Another possibility is that you have (accidentally) changed the port forwarding on your Belkin router, and/or the PAP2T's local LAN IP address changed and the port forwarding no longer works.
    So my proposals to check would be:
    - check and DISABLE SIP ALG on your Belkin (if the router has this feature)
    - check the port forwarding on your Belkin router... IF you set up port forwarding for 5060/61 you MUST set the same for the RTP ports (16384-16483)
    - if you don't have port forwarding on your Belkin, try to set up forwarding of 5060-5061 AND 16384-16483*.
    *(If this is too many ports, set the RTP port range on the PAP2T to 16384-16394, for example, and then forward only that range)
    - if neither of the above helps, change your Line1 SIP port from 5060 to, let's say, 6070, and the RTP ports to the range, let's say, 17300-17310
    - the last thing you can do on your side is to remove all port forwarding and put the PAP2T in the DMZ settings of your Belkin router.

  • Multicast in JMStudio??

    Hi!
    I want to send a video stream to a multicast address.
    I use JMStudio to transmit the stream. If I specify a unicast address everything works fine, but if I specify a multicast address (e.g. 192.168.100.0) the client does not receive anything.
    Does anyone have a solution for this problem?
    Deike
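
    As the reply below notes, 192.168.x.x is not a multicast range; multicast groups live in 224.0.0.0-239.255.255.255. A minimal sketch of transmitting to a real multicast group with an RTPManager and a TTL, in the style of AVTransmit2 (the group address, port and TTL below are example values):

    import java.net.InetAddress;
    import javax.media.Processor;
    import javax.media.rtp.RTPManager;
    import javax.media.rtp.SendStream;
    import javax.media.rtp.SessionAddress;

    class MulticastTransmit {
        // Sends the processor's RAW_RTP output to a multicast group.
        static void transmit(Processor processor) throws Exception {
            InetAddress group = InetAddress.getByName("224.1.2.3"); // example group in 224.0.0.0/4
            int port = 42050;                                       // example even RTP port
            int ttl = 1;                                            // keep packets on the local subnet
            RTPManager mgr = RTPManager.newInstance();
            // For multicast, local and remote session addresses are the same group address.
            SessionAddress local = new SessionAddress(group, port, ttl);
            SessionAddress target = new SessionAddress(group, port, ttl);
            mgr.initialize(local);
            mgr.addTarget(target);
            SendStream out = mgr.createSendStream(processor.getDataOutput(), 0);
            out.start();
        }
    }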

    Hello Deike,
    1) How are you able to multicast the data to that address as given in your example? Typically you can't use addresses starting with 192.x.x.x for multicast. I am wondering how you can say that you are multicasting to this address.
    "I use an IP address which has 255 instead of the host address. For example, in a class A network it would be 127.255.255.255, class B 149.243.255.255, and in class C (in my case) 192.168.100.255. So the packets are sent to each host in the network. That really works."
    ### I guess NOT. You may not be multicasting but broadcasting the packets. Since you don't have routers yet, you won't notice the traffic. Check it out. That is probably the reason why the client is able to receive the packets even though it has NOT subscribed to a multicast group.
    "You can't have your clients subscribe to this multicast address either. If you are trying to use anything in that range, then it is totally invalid." - "How does my client subscribe to a multicast address? And why does he have to?"
    ### Why does he have to... hmm, trickier to answer. I am thinking how... hmm, hey c'mon, that's just how things have to work, I say.
    "Refer to RFC 1340 for the valid addresses. This RFC might also address your other issue: how to run RTP & the problem with port 4000, bla bla." - "What is RFC 1340?"
    ### RFC documents are the specs; RFC 1340 comes from the Internet Assigned Numbers Authority (IANA). You have to comply with it.
    This is becoming more of a networking issue than a JMF one; we are discussing an off-topic issue in the wrong place. You might find more info elsewhere. Otherwise let me know.
    2) You can give it a quick try by multicasting to address 224.0.0.1. All your clients should get the data provided you have ONLY one IP router / IP switch between them.
    3) Make sure your IP routers / IP switches can handle the multicast data & process it correctly; you may have to configure them to do so.
    "I haven't used a router yet because my server is part of the two networks in which I tested my applet."
    ### Try going through a router & see whether the clients still receive the packets.
    4) By the way, what are you trying to multicast, is it MP3 audio? Did you solve your MP3 issue?
    "QuickTime movie, only video, no audio."
    ### By the way, I understand your application is streaming video, am I right? If so, I think you can help me with another thing. Let me know about this.
    Thank you (dank u, bedankt)
    -Chidu

  • Jmf web cam not recognize under ubuntu 12.04

    I installed JMF 2.1.1e under Ubuntu 12.04 following all the steps of the guide. JMStudio seems to work well; I can open a movie over RTP.
    The problem is that the webcam is not recognized, and when I try to get it to detect capture devices I get:
    java.lang.Error: Can not open video card 1
    java.lang.Error: Can not open video card 2
    java.lang.Error: Can not open video card 3
    java.lang.Error: Can not open video card 4
    java.lang.Error: Can not open video card 5
    java.lang.Error: Can not open video card 6
    java.lang.Error: Can not open video card 7
    java.lang.Error: Can not open video card 8
    java.lang.Error: Can not open video card 9
    I wanted to know if this depends on the model of the webcam, or whether this version of Linux is not supported.
    If the latter is true, is there another way to use the webcam from Java and integrate it with JMF?
    My application should use the webcam under Linux and transport the content over RTP.
    All of this works fine under Windows.
    Fortunately Linux should be spotless and without bugs :)

    Hello, since no great guru gave me an answer (I doubt there are any left now; gone are the good old days),
    I solved it using the library http://code.google.com/p/v4l4j/
    I hope this can be of help to someone and saves them a night of trying to make JMF work with a webcam on Ubuntu 12.04.

  • PAP2T - Can't hear any voice

    Hi, hoping someone can assist me.
    I have a PAP2T connected to my Windows XP system. I can make a call, but I can't hear the other party's voice, although they can hear my voice without any problem!
    So I can make calls without any problems, but I can't hear the other party's voice.
    What should I do to hear the other party's voice on my phone?
    Best regards

    Hello tbassil,
    this sounds to me like an RTP stream problem.
    Check your NAT / port forwarding rules; make sure all the ports
    5060, 5061, 16384-16482 are forwarded to your PAP2T on the NAT router.
