The HP webOS SDK offers two options for playing audio:

- Launching the built-in webOS Streaming Music Player app
- Creating HTML 5 Audio objects within your own app

Regardless of which option you choose, you can play audio from a local file on the user's device, or from a remote server. See Supported Audio Formats, below, for details on the protocols and formats webOS supports.
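If you accept arbitrary URLs, it can be useful to pre-check the filename extension before handing the URL off for playback. The helper below is illustrative only (it is not part of the webOS SDK), and its extension list is assembled from the formats discussed in this document:

```javascript
// Illustrative only -- not part of the webOS SDK. The extension list is
// assembled from the audio formats discussed in this document.
var SUPPORTED_AUDIO_EXTENSIONS = ["mp3", "aac", "m4a", "amr", "wav"];

// Returns true if the URL ends in one of the listed audio extensions.
// This is only a heuristic: a server may ignore the extension and
// report a different MIME type for the resource.
function looksLikeSupportedAudio(url) {
    var match = /\.([a-z0-9]+)(?:[?#]|$)/i.exec(url);
    return !!match &&
        SUPPORTED_AUDIO_EXTENSIONS.indexOf(match[1].toLowerCase()) !== -1;
}
```

Because the check is a heuristic, treat a false result as a reason to fall back to the Application Manager's open method (described below), which selects an app based on MIME type as well as extension.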

On this page:

- Using the webOS Streaming Music Player
- Using HTML 5 Audio Objects
- Audio Performance Tips
- Supported Audio Formats
- Volume Key Lock

Using the webOS Streaming Music Player

To initiate playback of an audio file in the webOS Streaming Music Player app, you launch the app using the Application Manager service, providing the URL (which may be local or remote) of the file or stream you want to play.

Application Manager provides two different methods for launching another application. The open method takes just a URL and uses the MIME type and/or filename extension to determine which application to launch. You may wish to use this method if you're not sure what type of resource (audio or video, for example) your URL represents and you want webOS to choose the right app for playback:

  this.controller.serviceRequest("palm://com.palm.applicationManager", {
      method: "open",
      parameters: {
          target: ""
      }
  });

The launch method allows you to specify which application to use. As long as you're sure your URL represents an audio resource, this method gives you more control and should result in a faster launch:

  this.controller.serviceRequest("palm://com.palm.applicationManager", {
      method: "launch",
      parameters: {
          id: "",
          params: {
              target: ""
          }
      }
  });

For details on the use of Application Manager, see the Application Manager reference.


Video and audio playback is not currently supported in the emulator.

Using HTML 5 Audio Objects


webOS 1.4 introduced a change in implementation of the HTML 5 Media API. Developers beginning work on new apps should use the updated API, described on this page. Developers with applications already in the App Catalog should transition to the updated API within the coming months.

For playing audio within your own app, the webOS SDK supports the proposed HTML 5 Media specification. To play audio, you can create one or more Audio objects, call the methods exposed by those objects to control playback, and handle the events generated by those objects as needed.

This document does not cover use of the HTML 5 Audio object in detail, but the HTML 5 Media spec does provide exhaustive detail, and many useful tutorials and walkthroughs may be found online.

Adding an Audio Object

An Audio object is typically added in JavaScript, using the Audio() constructor. In a Mojo Framework context, it often makes sense to do this in a scene's setup() method. For example:

MySceneAssistant.prototype.setup = function() {
  this.bang = new Audio();
  this.bang.src = Mojo.appPath + "/audio/bang.wav";
};

It's also possible to add an audio object by including the <audio> element in your HTML markup, either in your index.html file or in the view markup for a particular scene:

<audio src="" id="myAudioElement"></audio> 

Note, however, that webOS does not currently support the controls attribute described in the HTML 5 proposal, so if you want to provide playback controls, you'll need to add UI elements yourself and use the methods exposed by the Audio object to control playback.

Loading the MediaExtension Library

webOS adheres very closely to the HTML 5 Media spec, but there are some cases (described below) in which it's necessary to augment the specified functionality. For these cases, webOS provides the MediaExtension library.

Currently, cases that require use of the MediaExtension library include:

- Setting an object's audio class
- Determining whether a media object is pausable

To use the MediaExtension library, you first need to include MojoLoader in your application by adding the following line within the <head> section of your app's index.html file:

<script src="/usr/palm/frameworks/mojoloader.js" type="text/javascript"></script> 

Then, generally within a scene assistant, you load the MediaExtension library and use it to instantiate a MediaExtension object for any Audio or Video object that may require one:

// Load the MediaExtension library
this.libs = MojoLoader.require({ name: "mediaextension", version: "1.0"});

// If you don't already have one, get a reference to the media element, using its ID
this.mediaObj = this.controller.get("myMediaElement");

// Instantiate the MediaExtension object
this.extObj = this.libs.mediaextension.MediaExtension.getInstance(this.mediaObj);

Setting an Object's Audio Class

To ensure a good user experience, webOS automatically manages some aspects of audio and video playback. For example, when a webOS device receives a phone call or triggers an audible alert, it automatically mutes, reduces the volume of, or pauses any other audio or video that may be playing.

In order to do the right thing, webOS needs to know something about the nature of each media object that your application plays. You can provide webOS with the information it needs by setting an object's audio class.

webOS supports a number of audio classes at the system level, but for applications there are only two classes that commonly apply: audio and video content should be assigned the media class, while application sounds (e.g. sound effects, UI cues, etc.) should be assigned the defaultapp class.

webOS assumes the defaultapp audio class unless you specify otherwise, so in practice you generally only need to set the audio class for objects that should be classified as media.

An Audio or Video object whose audio class is set to media is subject to this automatic management: webOS may, for example, pause the object when the device receives a phone call and resume it when the call ends.

When webOS pauses or resumes playing a particular object, the object will fire a pause or play event, per the HTML 5 Media specification. Your application should listen for these events and update any playback controls your UI may provide (and perform other operations as needed). See Handling Media Events, below.

To set an object's audio class, you first need to load the MediaExtension library and obtain a MediaExtension object, as described above. Once you have obtained a MediaExtension object, it's simple to set the audio class:

// this.extObj is a MediaExtension object associated with
// the Media object whose audio class we want to set
this.extObj.audioClass = "media";

Controlling a Media Object

A detailed discussion of controlling HTML 5 Media objects is beyond the scope of this document, but this section provides a high-level introduction to controlling media objects within a webOS application. For more information, please refer to the HTML 5 Media specification or reference and tutorial resources available from other sources.

Playing, Pausing and Seeking

Per the HTML 5 Media spec, media objects expose play() and pause() methods, and a currentTime property for seeking to a particular point in the audio. You can use these (along with the object's other methods and properties) to control playback.
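Since currentTime is expressed in seconds, a small formatting helper can be handy when displaying the playback position in your UI. The function below is a sketch (its name and behavior are our own, not part of the webOS SDK):

```javascript
// Illustrative helper, not part of the webOS SDK: format a currentTime
// value (in seconds) as "m:ss" for display in a playback UI.
function formatPlaybackTime(seconds) {
    var whole = Math.floor(seconds);          // drop fractional seconds
    var mins = Math.floor(whole / 60);
    var secs = whole % 60;
    return mins + ":" + (secs < 10 ? "0" + secs : secs);
}
```

For example, formatPlaybackTime(this.mediaObj.currentTime) could be called from a periodic timer or a timeupdate event listener to refresh a position label.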

The following example illustrates how you might control media playback in response to button taps in your UI (assuming you are listening for taps on each button and have registered the following methods as handlers):

MySceneAssistant.prototype.handlePlayButtonTap = function() {
  this.mediaObj.play();
};

MySceneAssistant.prototype.handlePauseButtonTap = function() {
  this.mediaObj.pause();
};

MySceneAssistant.prototype.handleRewindButtonTap = function() {
  this.mediaObj.currentTime = 0.0;
};

Determining Whether a Media Object is Pausable

Some media streams that use the RTSP protocol are not pausable. You may need to adjust your application's playback UI or logic in this case, so webOS provides a mechanism for checking to see whether an object is pausable.

You first need to load the MediaExtension library and obtain a MediaExtension object, as described above. Once you have obtained a MediaExtension object, you can check for pausability as shown here:

// this.extObj is a MediaExtension object associated with
// the Media object whose pausability we want to check
if (this.extObj.pausable) {
  // Adjust UI and app logic accordingly...
}

Handling Media Events

HTML 5 Media objects fire a variety of events to indicate state changes. You can listen for these events and respond as appropriate within your application.

As noted above, under certain circumstances webOS may automatically pause and resume playback of objects whose audio class you have set to media. When this occurs, the affected object will fire a pause or play event, which you should use to trigger UI updates and any other operations your app may need to perform.

The following example illustrates how to listen for and respond to these events:

MySceneAssistant.prototype.setup = function() {
  // Load the MediaExtension library, required to set audio class
  this.libs = MojoLoader.require({ name: "mediaextension", version: "1.0"});

  // Get a reference to the media element, using its ID
  this.mediaObj = this.controller.get("myMediaElement");

  // Get the MediaExtension object and set the audio class
  this.extObj = this.libs.mediaextension.MediaExtension.getInstance(this.mediaObj);
  this.extObj.audioClass = "media";

  // Listen for pause and play events
  this.mediaObj.addEventListener("pause", this.handlePause.bind(this), true);
  this.mediaObj.addEventListener("play", this.handlePlay.bind(this), true);
};

MySceneAssistant.prototype.handlePause = function(evt) {
  Mojo.Log.info("received pause event");
  // Update UI, etc.
};

MySceneAssistant.prototype.handlePlay = function(evt) {
  Mojo.Log.info("received play event");
  // Update UI, etc.
};

Audio Performance Tips

If your app plays multiple short sounds (as effects or UI cues, for example), the following tips will help to maximize performance.

Audio Object Garbage Collection

Be aware that holding references to many Audio objects simultaneously can degrade application and system performance due to memory constraints; in extreme cases, audio may stop functioning entirely, requiring a device reset.

When you allocate a new Audio object in your app (that is, by calling new Audio()), the returned JavaScript object is garbage collected once your code no longer holds any references to it. A best practice, therefore, is to avoid holding references to Audio objects you no longer need.

Also keep in mind that when you add an Audio object by including an <audio> element in your HTML markup, garbage collection occurs when the scene is popped. For example, when using this form:

<audio src="" id="myAudioElement"></audio>

the Audio object associated with this HTML element will be garbage collected when the corresponding scene is popped.
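These practices can be combined in a small cache that creates each sound once and drops the oldest reference when a size limit is reached. This is a sketch, not webOS API; the factory argument is injected so the cache itself does not depend on the Audio constructor and can run outside a browser:

```javascript
// Sketch of a sound cache that limits how many Audio objects are held at
// once. Not part of the webOS SDK. The factory argument creates a sound
// for a given source path; in an app you would pass something like
//   function (src) { var a = new Audio(); a.src = src; return a; }
function SoundCache(factory, maxSize) {
    this.factory = factory;
    this.maxSize = maxSize;
    this.keys = [];      // insertion order, oldest first
    this.sounds = {};    // src -> sound object
}

// Return the cached sound for src, creating it on first use. When the
// cache is full, the oldest entry's reference is dropped so the
// corresponding Audio object becomes eligible for garbage collection.
SoundCache.prototype.get = function(src) {
    if (!this.sounds.hasOwnProperty(src)) {
        if (this.keys.length >= this.maxSize) {
            var oldest = this.keys.shift();
            delete this.sounds[oldest];
        }
        this.sounds[src] = this.factory(src);
        this.keys.push(src);
    }
    return this.sounds[src];
};
```

The size limit keeps the number of live Audio objects bounded, trading an occasional re-creation of an evicted sound for predictable memory use.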

Supported Audio Formats

This section lists the supported audio formats and recommended streaming protocols.

Supported Formats

Supported audio formats include MP3, AAC, AAC+, eAAC+, AMR, and WAV.

The following table lists the recommended encoding settings for streaming.

Protocol                              Bandwidth   Recommended                       Supported
HTTP progressive download             High        64 Kbps, AAC+, 44 kHz, stereo     All local formats are supported.
                                      Low         24 Kbps, eAAC+, 44 kHz, stereo    All local formats are supported.
Real Time Streaming Protocol (RTSP)   Low         24 Kbps, eAAC+, 44 kHz, stereo    AAC or AMR
SHOUTcast and Icecast                 High        64 Kbps, AAC+, 44 kHz, stereo     AAC or MP3
                                      Low         24 Kbps, eAAC+, 44 kHz, stereo    AAC or MP3

Volume Key Lock

The lockVolumeKeys method of the audio media player service allows the device's volume keys to adjust the media volume. Call this when your application requires the volume keys to adjust media volume rather than ringtone volume.

The typical use case would be when a media-based application is active, but media is not currently playing. In this case, the user expects volume keys to control the media volume setting. However, by default, if media is not playing, the volume keys still control the ringer volume.

Here's an example from the Music Player app:

markAppForeground: function(callback) {
  var parameters = {};
  parameters.subscribe = true;
  parameters.foregroundApp = true;
  return new Mojo.Service.Request('palm://com.palm.audio/media', {
          method: 'lockVolumeKeys',
          onSuccess: callback,
          parameters: parameters
      });
},

This allows the user to change the volume level even though a song isn't currently playing.
