Video Streaming using Java Media Framework (JMF 2.0)
The Java™ Media Framework (JMF) is an application programming interface (API) for incorporating audio, video, and other time-based media into Java applications and applets.
JMF 2.0 supports media capture and addresses the needs of application developers who want additional control over media processing and rendering. It also offers a plug-in architecture that gives direct access to media data and enables JMF to be customized and extended more easily. JMF 2.0 is designed to:
Be easy to program
Support capturing media data
Enable the development of media streaming and conferencing applications in Java
Provide access to raw media data
By exploiting the advantages of the Java platform, JMF delivers the promise of "Write Once, Run Anywhere" to developers who want to use media such as audio and video in their Java programs.
The JMF architecture is broadly made up of data sources and players: a data source encapsulates the media stream, and a player provides the processing and control mechanisms. Playing and capturing audio and video with JMF requires appropriate input and output devices such as microphones, cameras, speakers, and monitors.
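As a minimal sketch of this data-source/player model, a player can be created directly from a media locator using the standard JMF `Manager` factory class (the file path below is a placeholder; running this requires a JMF installation):

```java
import javax.media.Manager;
import javax.media.MediaLocator;
import javax.media.Player;

public class SimplePlayer {
    public static void main(String[] args) throws Exception {
        // A MediaLocator encapsulates where the media comes from:
        // a file, a capture device, or a network URL (placeholder path here).
        MediaLocator locator = new MediaLocator("file:///tmp/sample.mov");
        // Manager is JMF's factory class; createRealizedPlayer blocks until
        // the player has acquired the resources it needs to play the stream.
        Player player = Manager.createRealizedPlayer(locator);
        player.start(); // begin rendering to the default output devices
    }
}
```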
Supported video formats include AVI, QuickTime, and MPEG-1, among others.
To send or receive a live media broadcast or conduct a videoconference over the Internet or an intranet, you need to be able to receive and transmit media streams in real time. This section introduces streaming media concepts and describes the Real-time Transport Protocol (RTP) that JMF uses for receiving and transmitting media streams across the network.
When media content is streamed to a client in real-time, the client can begin to play the stream without having to wait for the complete stream to download. In fact, the stream might not even have a predefined duration--downloading the entire stream before playing it would be impossible. The term streaming media is often used to refer to both this technique of delivering content over the network in real-time and the real-time media content that's delivered.
Streaming media is everywhere you look on the web--live radio and television broadcasts and web cast concerts and events are being offered by a rapidly growing number of web portals, and it's now possible to conduct audio and video conferences over the Internet. By enabling the delivery of dynamic, interactive media content across the network, streaming media is changing the way people communicate and access information.
RTP provides end-to-end network delivery services for the transmission of real-time data. RTP is network and transport-protocol independent, though it is often used over UDP.
An RTP session is an association among a set of applications communicating with RTP. A network address and a pair of ports identify a session. One port is used for the media data and the other is used for control (RTCP) data.
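The data/control port pairing follows a simple convention (from the RTP specification): the media data uses an even-numbered port and the RTCP control data uses the next higher port. A tiny illustration (the helper name and port value are ours, for illustration):

```java
public class RtpPorts {
    // By RTP convention, media data goes on an even port and the
    // RTCP control data goes on the next higher (odd) port.
    static int controlPort(int mediaPort) {
        return mediaPort + 1;
    }

    public static void main(String[] args) {
        int mediaPort = 22222; // hypothetical session port
        System.out.println("RTP  data on port " + mediaPort);        // 22222
        System.out.println("RTCP ctrl on port " + controlPort(mediaPort)); // 22223
    }
}
```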
A participant is a single machine, host, or user participating in the session. Participation in a session can consist of passive reception of data (receiver), active transmission of data (sender), or both.
Each media type is transmitted in a different session. For example, if both audio and video are used in a conference, one session is used to transmit the audio data and a separate session is used to transmit the video data. This enables participants to choose which media types they want to receive--for example, someone who has a low-bandwidth network connection might only want to receive the audio portion of a conference.
RTP applications are often divided into those that need to be able to receive data from the network (RTP Clients) and those that need to be able to transmit data across the network (RTP Servers). Some applications do both--for example, conferencing applications capture and transmit data at the same time that they're receiving data from the network.
Being able to receive RTP streams is necessary for several types of applications. For example:
Conferencing applications need to be able to receive a media stream from an RTP session and render it on the console. A telephone answering machine application needs to be able to receive a media stream from an RTP session and store it in a file.
An application that records a conversation or conference must be able to receive a media stream from an RTP session and both render it on the console and store it in a file.
RTP server applications transmit captured or stored media streams across the network.
For example, in a conferencing application, a media stream might be captured from a video camera and sent out on one or more RTP sessions. The media streams might be encoded in multiple media formats and sent out on several RTP sessions for conferencing with heterogeneous receivers. Multiparty conferencing could be implemented without IP multicast by using multiple unicast RTP sessions.
This application lets a user transmit video from one computer to another. The source of the video can be a file, live video from a video capture device, or any other source supported by JMF.
We begin by using the JMF API to read the source and convert it to packetized JPEG data. The RTP API implementation included in JMF then transmits the video using the RTP protocol.
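The pipeline just described can be sketched with the JMF API as follows. This is a simplified sketch, not the package's actual source: the capture locator and destination address are placeholder values, and ProcessorModel is used so that Manager handles the processor's state transitions synchronously (requires a JMF installation and a capture device to run):

```java
import javax.media.DataSink;
import javax.media.Format;
import javax.media.Manager;
import javax.media.MediaLocator;
import javax.media.Processor;
import javax.media.ProcessorModel;
import javax.media.format.VideoFormat;
import javax.media.protocol.ContentDescriptor;

public class TransmitSketch {
    public static void main(String[] args) throws Exception {
        // Source of the video -- a capture device here; could also be a file URL.
        MediaLocator src = new MediaLocator("vfw://0");

        // Ask JMF for a realized processor whose video output is
        // packetized JPEG (JPEG/RTP) in a raw RTP content descriptor.
        Format[] outputFormats = { new VideoFormat(VideoFormat.JPEG_RTP) };
        ContentDescriptor cd = new ContentDescriptor(ContentDescriptor.RAW_RTP);
        Processor p = Manager.createRealizedProcessor(
                new ProcessorModel(src, outputFormats, cd));

        // Point a DataSink at an rtp:// URL; JMF's RTP implementation
        // then performs the actual network transmission.
        MediaLocator dest = new MediaLocator("rtp://224.112.112.112:22222/video");
        DataSink sink = Manager.createDataSink(p.getDataOutput(), dest);
        sink.open();
        sink.start();
        p.start();
    }
}
```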
The program starts by presenting a GUI that takes three parameters:
Media Locator: the source of the video stream to be transmitted. It can be a URL, a video capture device (live), or a local file.
IP Address: the IP address of the destination machine. By specifying a broadcast or multicast address you can send to all machines or to a select group.
Port: the port number on which the destination receiving application is listening. In our case JMStudio itself runs as the receiving application.
There are two main classes in this package:
VideoStream is the main application. It presents the GUI and delegates the work of actually transmitting the video to VideoTransmit.
To run the sample code, do the following:
MediaLocator: specify the source of the video, for example vfw://0 to use a video capture device as the source.
On the receiving machine, start JMStudio with the RTP session URL:
java JMStudio rtp://<sourceIP>:<port>/video
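Putting both sides together, a run might look like the following (the class name VideoStream, the multicast address, and the port are example values assumed from the description above):

```shell
# Transmitter side: send the local capture device to a multicast group
java VideoStream vfw://0 224.112.112.112 22222

# Receiver side: open the RTP session in JMStudio
java JMStudio rtp://<sourceIP>:22222/video
```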
And there you are, ready to go!
You need to download JMF from java.sun.com/products/jmf
You need to install JMStudio
The source video, whether it's a file or live video, needs to be in a format that can be converted to JPEG/RTP. Cinepak, RGB, YUV, and JPEG are good formats; other formats may not work due to restrictions inside the processor. Also, both dimensions of the video must be multiples of 8, since JPEG encodes the frame in 8x8 blocks. For example, 320x240 and 176x144 are good sizes, while 240x180 and 90x60 are not.
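The dimension rule is easy to check programmatically. A small helper (the method name is ours, for illustration) that validates a frame size against the 8x8-block requirement:

```java
public class FrameSizeCheck {
    // JPEG/RTP encodes the frame in 8x8 blocks, so both the width
    // and the height must be multiples of 8.
    static boolean isValidJpegRtpSize(int width, int height) {
        return width % 8 == 0 && height % 8 == 0;
    }

    public static void main(String[] args) {
        System.out.println(isValidJpegRtpSize(320, 240)); // true
        System.out.println(isValidJpegRtpSize(176, 144)); // true
        System.out.println(isValidJpegRtpSize(240, 180)); // false (180 % 8 != 0)
        System.out.println(isValidJpegRtpSize(90, 60));   // false
    }
}
```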