Use Windows Media 9 Series to create live broadcasts

With Windows Media 9 Server, you can serve up live broadcasts as a multicast, which means that a single stream serves all the clients that request it. Here's how to use the Windows Media Encoder 9 engine (Encoder) to create a live multimedia broadcast from a Web page.

When delivering multimedia presentations across an intranet, most developers think of prerecording the content, digitizing it, and delivering it to the masses as a multimedia file (such as an MP3). However, 1,000 concurrent users requesting the same file can put a serious burden on your network.
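To put rough numbers on that burden: with unicast delivery, aggregate bandwidth grows linearly with the audience, while a multicast stream is sent once no matter how many clients tune in. A quick sketch, assuming a hypothetical 300 Kbps stream (the bit rate is an assumption for illustration only):

```javascript
// Aggregate unicast load: every client pulls its own copy of the stream.
function unicastLoadKbps(clients, streamKbps) {
    return clients * streamKbps;
}

// Multicast load: one copy of the stream serves every client,
// so the load is independent of audience size.
function multicastLoadKbps(clients, streamKbps) {
    return streamKbps;
}

// 1,000 viewers of a hypothetical 300 Kbps stream:
// unicast   -> 300,000 Kbps (roughly 293 Mbps) on the server's link
// multicast -> 300 Kbps, regardless of audience size
```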

With Windows Media 9 Server, you can serve up live broadcasts as a multicast, so a single stream serves all the clients that request it. The drawback to live broadcasts is that they're difficult to implement, but the Windows Media Encoder 9 Series makes this easier: the Encoder is an extensible component that you can script to create a solution tailored to the broadcaster. In this article, I'll use the Windows Media Encoder 9 engine (Encoder) to create a live multimedia broadcast from a Web page. I'll also create a client page to receive the broadcast, along with custom commands and captions.

The Encoder is a scriptable component, available as a free download from Microsoft, that also runs as a stand-alone executable. The Encoder can either serve the live broadcast itself or push that load off onto a Windows Media server. In this example, I'll serve the broadcast from my local computer, which runs Windows XP Pro with IIS 5.0. The first step is to create a configuration file, which is an XML file with a .wme extension.

To ease the burden of creating the configuration file by hand, I start the Encoder and follow the wizard for creating a live broadcast. The first screen asks for the video and audio input devices; I have a Web camera and a microphone, so I select the camera from the drop-down list and use the default device for the audio. The wizard then asks whether you want to push the content to a Windows Media server or have clients connect to your computer and pull the content.

For my example, I choose Pull and accept the default HTTP port. On the following screen, you can set the quality of the audio and video that you wish to broadcast. (Remember: the higher the quality, the more bandwidth it requires.) I choose Live Broadcast for video and Voice Quality for audio. The next screen asks whether you wish to archive your broadcast for future use. This is important if you're doing executive broadcasts and want to store the file; on my PC, however, it would just take up space, so I pass on the archive option. Finally, follow the remaining screens to the end of the wizard and click Finish.

Check that you're getting a video feed from your camera, and verify the audio feed from your microphone by choosing Audio Panel from the View menu; you should see an audio level indicator that moves up and down when you speak. Click the Properties button on the toolbar or choose Properties Panel from the View menu. On the Sources tab, select the "Both Device And File" option under Source From, and check the Script checkbox. (This is important if you use custom commands.) Save the configuration file and deploy it on your local site.

Now that you have the configuration file, you can script the Encoder engine to create a more tailored user experience and spare the broadcaster from having to learn the Encoder itself. Note, however, that the Encoder engine must be installed on the broadcaster's computer.

Here's the HTML page to accomplish the custom broadcast:

<html>
<head>
<script language="JavaScript">
// Create the Encoder engine and load the saved broadcast configuration.
var g_objEncoder = new ActiveXObject("WMEncEng.WMEncoder");
g_objEncoder.Load("http://localhost/config.wme");

// Start and stop the live broadcast.
function start() {
    g_objEncoder.Start();
}
function stop() {
    g_objEncoder.Stop();
}
// Send a URL command; "&&frame2" targets the client frame named frame2.
function sendURL(url) {
    g_objEncoder.SendScript(0, "URL", url + "&&frame2");
}
// Send an arbitrary text command for the client page's script to handle.
function sendCmd(cmd) {
    g_objEncoder.SendScript(0, "TEXT", cmd);
}
// Send caption text; WMP renders it in the element named by captioningID.
function sendCaption(msg) {
    g_objEncoder.SendScript(0, "CAPTION", msg);
}
</script>
</head>
<body>
<button onclick="start()">Begin Broadcast</button>
<button onclick="stop()">End Broadcast</button><br>
<input type="text" name="txtURL" id="txtURL" size="50"><br>
<button onclick="sendURL(txtURL.value)">Go URL</button><br>
<textarea id="txtMsg" rows="5" cols="80"></textarea><br>
<button onclick="sendCaption(txtMsg.value.replace(/\n/g, '<br>'))">Send
 Caption</button><br>
<button onclick="sendCmd('showAlert')">Show Alert Box</button>
</body>
</html>

The reason for doing this as a Web page is simple: it's quicker to set up a UI in HTML than to build a custom application. For this page to work, you must allow ActiveX controls that aren't marked as safe to initialize and run in Internet Explorer's security settings, but it demonstrates the functionality that's available.

Since all the configuration information is contained in the configuration file (which you load when the ActiveX component is created), the broadcaster doesn't have to go through the steps of setting up the broadcast environment. All that's left is to start and end the broadcast.

The preceding HTML creates a page with two buttons for starting and stopping the broadcast. There is a text field for entering a URL and a corresponding button that allows you to navigate from the client's browser to the specified URL. There is a <TEXTAREA> with a corresponding button that allows the broadcaster to send "captions" to the client, and there's a button for sending a command to the client that will be handled by a JScript block in the client HTML.
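The inline replace call on the caption button can be factored into a small helper that converts the <TEXTAREA>'s newlines into <br> tags before the text is sent. (toCaptionHtml is my name for this helper; it's not part of the Encoder API.)

```javascript
// Convert textarea line breaks into <br> tags so the caption
// renders as multiple lines in the client page's caption <div>.
function toCaptionHtml(msg) {
    return msg.replace(/\n/g, "<br>");
}

// The caption button's handler then becomes:
//   sendCaption(toCaptionHtml(txtMsg.value));
```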

Create the client HTML page

Windows Media Player (WMP) can receive broadcast streams identified in metafiles, which are like catalogs for WMP. These XML-based files detail the streams associated with the metafile and reference information about the stream such as the title, author, and copyright. You must use metafiles in order to use special commands, give useful information about your stream, and create playlist entries.

In this example, I specify some information about my broadcast stream:

<ASX version="3.0">
<TITLE>Basic Playlist Demo</TITLE>
    <ENTRY>
        <TITLE>This is a test.</TITLE>
        <AUTHOR>Phillip Perkins</AUTHOR>
        <COPYRIGHT>(c)2004 Phillip Perkins</COPYRIGHT>
        <REF href="http://localhost:8080/Live" />
    </ENTRY>
</ASX>

This code defines the broadcast stream. Note the <REF> tag; it provides the address of the stream source for the client. When you set up the Encoder configuration file, you give your stream a name. The name of my stream is Live, which appears in the HREF attribute of the <REF> tag.
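An <ENTRY> can also carry more than one <REF>; WMP tries the sources in order, which gives you a fallback if the primary encoder is unreachable. A hypothetical sketch (both addresses here are illustrative assumptions, and the backup host isn't part of this setup):

```xml
<ASX version="3.0">
    <ENTRY>
        <TITLE>Live broadcast with fallback</TITLE>
        <!-- WMP tries each REF in order until one connects -->
        <REF href="http://localhost:8080/Live" />
        <REF href="http://backup-encoder:8080/Live" />
    </ENTRY>
</ASX>
```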

Here's the client code:

<html>
<head>
<script for="player" event="ScriptCommand(scType, Param)" language="JScript">
alert(scType + ", " + Param);
</script>
</head>
<body bgcolor="black">

<OBJECT id=player style="LEFT: 0px; TOP: 0px;" type=application/x-oleobject
 classid="CLSID:6BF52A52-394A-11d3-B153-00C04F79FAA6" VIEWASTEXT>
    <PARAM NAME="URL" VALUE="http://localhost/Develop/Media/wm_test.wvx">
    <PARAM NAME="rate" VALUE="1">
    <PARAM NAME="balance" VALUE="0">
    <PARAM NAME="currentPosition" VALUE="0">
    <PARAM NAME="defaultFrame" VALUE="">
    <PARAM NAME="playCount" VALUE="1">
    <PARAM NAME="autoStart" VALUE="-1">
    <PARAM NAME="currentMarker" VALUE="0">
    <PARAM NAME="invokeURLs" VALUE="-1">
    <PARAM NAME="baseURL" VALUE="">
    <PARAM NAME="volume" VALUE="100">
    <PARAM NAME="mute" VALUE="0">
    <PARAM NAME="uiMode" VALUE="full">
    <PARAM NAME="stretchToFit" VALUE="0">
    <PARAM NAME="windowlessVideo" VALUE="0">
    <PARAM NAME="enabled" VALUE="0">
    <PARAM NAME="enableContextMenu" VALUE="0">
    <PARAM NAME="fullScreen" VALUE="0">
    <PARAM NAME="SAMIStyle" VALUE="">
    <PARAM NAME="SAMILang" VALUE="">
    <PARAM NAME="SAMIFilename" VALUE="">
    <PARAM NAME="captioningID" VALUE="divCaptions">
    <PARAM NAME="enableErrorDialogs" VALUE="0">
</OBJECT>
<br><br>

<div id="divCaptions" style="
    width: 95%;
    font-size: 10pt;
    font-family: Verdana;
    color: lightsteelblue;
    height: 100px;
    border: 1px lightsteelblue solid;
    padding: 2px 2px;
">
</div>

</body>
</html>

The most important part of this page is the WMP specified by the <OBJECT> tag. There are a number of <PARAM>s associated with this object, but the two most important ones (for my purposes) are the URL parameter and the captioningID parameter. The URL is the address of the metafile, and the captioningID is the ID of the <DIV> element used to display any captions (divCaptions on this page).

Also note the <SCRIPT> block in the page. This script handles WMP's ScriptCommand event; when you send a command with the stream, it's delivered to this event handler. In my script, I simply display an alert box with the two parameters passed to the handler. You can use this functionality to trigger other actions when the broadcaster sends a special command, such as filling in form data on an associated form.
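Instead of a bare alert, the handler can dispatch on the command type and name. One way to route incoming commands might look like the following sketch (the function and return values are my own naming, not a WMP convention):

```javascript
// Route an incoming script command to a page-specific action.
// Returns the name of the action taken so the caller can react.
function routeScriptCommand(scType, param) {
    if (scType === "TEXT" && param === "showAlert") {
        return "alert";      // e.g., pop a dialog for the viewer
    }
    if (scType === "CAPTION") {
        return "caption";    // WMP already writes these to divCaptions
    }
    return "ignored";        // unknown commands are silently dropped
}
```

The <SCRIPT FOR="player"> block would then call routeScriptCommand(scType, Param) and act on the result.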

In my earlier example, I gave the broadcaster the ability to send a URL command to the client. At the end of the URL, I concatenated the string "&&frame2", which tells WMP which frame to target with the URL. This comes in handy if you have a Web presentation and want to progress through a series of exported slides: you'd include this Web page in one frame and put the "slides" in the adjoining frame.
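The "&&frame" convention can be wrapped in a small builder so the frame name isn't hard-coded inside sendURL. (buildTargetedUrl is a hypothetical helper of mine, not part of the SendScript API.)

```javascript
// Build WMP's frame-targeting form of a URL command: "url&&frameName".
function buildTargetedUrl(url, frameName) {
    return url + "&&" + frameName;
}

// sendURL could then be written as:
//   g_objEncoder.SendScript(0, "URL", buildTargetedUrl(url, "frame2"));
```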

Windows Media 9 Series makes it easy to create custom multimedia streams and broadcasts. Another benefit is the cost: the Encoder is a free download, and Windows Media Services 9 Series (the server component) is an optional component of Windows Server 2003. You can also add Windows Media Services 9 Series to Windows 2000 Server.
