How to generate live audio/video data that can be played back using JMF. This can include translating data received through a non-standard protocol into a format that JMF can recognize. For example, you might be receiving data from a hardware device that JMF does not support, or receiving packetized audio/video data over an ATM network through a proprietary protocol.
This solution shows you how to create a custom PushBufferDataSource, which is new in JMF 2.0. A PushBufferDataSource contains streams of type PushBufferStream. Such a stream typically generates audio/video data organized as frames rather than as a continuous stream of bytes, so each Buffer object carries an entire frame of video or a sizable chunk of audio (anywhere from roughly 25 milliseconds' to 2 seconds' worth). The data can be in a compressed or uncompressed format. The stream advertises the Format of its data through its getFormat method.
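To illustrate the contract, here is a minimal sketch of a PushBufferStream that advertises raw RGB video. The class name SketchStream and the grabFrame helper are hypothetical placeholders; the sample's LiveStream.java is the complete version.

```java
import java.io.IOException;
import javax.media.Buffer;
import javax.media.Format;
import javax.media.format.RGBFormat;
import javax.media.protocol.BufferTransferHandler;
import javax.media.protocol.ContentDescriptor;
import javax.media.protocol.PushBufferStream;

public class SketchStream implements PushBufferStream {
    private final Format format = new RGBFormat(); // advertise raw RGB video
    private BufferTransferHandler handler;

    public Format getFormat() {
        return format;
    }

    // Called by JMF after this stream signals handler.transferData(this);
    // fills the Buffer with one complete video frame.
    public void read(Buffer buffer) throws IOException {
        byte[] frame = grabFrame();
        buffer.setData(frame);
        buffer.setOffset(0);
        buffer.setLength(frame.length);
        buffer.setFormat(format);
        buffer.setTimeStamp(System.currentTimeMillis() * 1000000L); // ns
    }

    // JMF registers a handler here; a capture thread would call
    // handler.transferData(this) whenever a new frame is ready.
    public void setTransferHandler(BufferTransferHandler handler) {
        this.handler = handler;
    }

    // Remaining SourceStream/Controls methods.
    public ContentDescriptor getContentDescriptor() {
        return new ContentDescriptor(ContentDescriptor.RAW);
    }
    public long getContentLength() { return LENGTH_UNKNOWN; }
    public boolean endOfStream() { return false; }
    public Object[] getControls() { return new Object[0]; }
    public Object getControl(String type) { return null; }

    // Hypothetical frame grabber; returns a dummy 320x240 24-bit frame.
    private byte[] grabFrame() {
        return new byte[320 * 240 * 3];
    }
}
```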
The sample code includes two classes - one is a subclass of PushBufferDataSource and the other is an implementation of PushBufferStream (a sketch of the data-source side follows below). By default, the generated stream is a video stream of raw RGB data; set the videoData variable in LiveStream.java to false to generate audio data instead.
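For the data-source side, a minimal sketch of a PushBufferDataSource subclass might look like the following. It pairs with the hypothetical SketchStream above and is not the sample's DataSource.java, which wraps LiveStream.

```java
import java.io.IOException;
import javax.media.Time;
import javax.media.protocol.ContentDescriptor;
import javax.media.protocol.PushBufferDataSource;
import javax.media.protocol.PushBufferStream;

public class SketchDataSource extends PushBufferDataSource {
    private final PushBufferStream[] streams =
        new PushBufferStream[] { new SketchStream() };
    private boolean connected;

    // A Player obtains the live streams through this method.
    public PushBufferStream[] getStreams() {
        return streams;
    }

    public String getContentType() {
        return ContentDescriptor.RAW;
    }

    public void connect() throws IOException { connected = true; }
    public void disconnect() { connected = false; }

    public void start() throws IOException {
        // A real source would start its capture thread here.
    }
    public void stop() throws IOException { }

    // A live source has no fixed duration.
    public Time getDuration() { return DURATION_UNBOUNDED; }
    public Object[] getControls() { return new Object[0]; }
    public Object getControl(String type) { return null; }
}
```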
Requirements
| Requirement | Details |
| --- | --- |
| Platform | JDK 1.1.6 or later |
| JMF API | 2.0 or later |
| Implementation | AJ, WPP, SPP* |
| Other software | Swing 1.1 |
* AJ = All Java, WPP = Windows Performance Pack, SPP = Solaris Performance Pack
Related Classes

- javax.media.protocol.PushBufferDataSource
- javax.media.protocol.PushBufferStream
How to run this sample

Compile the two source files:

```
javac -d . DataSource.java LiveStream.java
```
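One way to exercise the compiled classes, assuming DataSource ends up in the default package, is to hand an instance straight to Manager.createPlayer. A minimal, hypothetical test harness:

```java
import javax.media.Manager;
import javax.media.Player;

public class PlayLive {
    public static void main(String[] args) throws Exception {
        // Instantiate the custom source directly; no MediaLocator
        // or protocol handler is needed this way.
        DataSource source = new DataSource();
        source.connect();

        // Manager builds a Player around the PushBufferDataSource.
        Player player = Manager.createPlayer(source);
        player.start(); // realizes, prefetches, and starts asynchronously
    }
}
```

To actually see the video, a GUI application would wait for the Player to reach the Realized state and add player.getVisualComponent() to a Frame.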
Source Code