
Using Task-Specific Background Processing

So far, we haven't actually done any real background processing! We've suspended an application and generated local notifications, but, in each of these cases, the application hasn't been doing any processing. Let's change that! In our final two examples, we'll execute real code behind the scenes while the application is in the background. Although it is well beyond the scope of this book to generate a VoIP application, we can use our Cupertino application from last hour's lesson, with some minor modifications, to show background processing of location and audio!

Preparing the Cupertino Application for Audio

When we finished off the Cupertino application in the last hour, it told us how far away Cupertino was and presented straight, left, and right arrows onscreen to indicate the direction the user should travel to reach the Mothership. We can update the application to speak these directions using System Sound Services, just as we did in Hour 10's GettingAttention application.

The only tricky part of our changes is that we don't want to hear a sound repeated if it is the same as the last sound played. To handle this requirement, we'll use a constant for each sound (1 for straight, 2 for right, and 3 for left) and store the value of whichever sound was played most recently in a variable called lastSound. Before playing a sound, we can compare it against lastSound to make sure that what we're about to play isn't the same thing we just played!

Adding the AudioToolbox Framework

To use System Sound Services, we need to first add the AudioToolbox framework. Open the Cupertino (with Compass implementation) project in Xcode. Right-click the Frameworks group and choose Add, Existing Frameworks. Choose AudioToolbox.framework from the list that appears, and then click Add, as shown in Figure 21.5.

Figure 21.5 Add the AudioToolbox.framework to the project.

Adding the Audio Files

Within the Cupertino Audio Compass - Navigation and Audio folder included with this hour's lesson, you'll find an Audio folder. Drag the files from the Audio folder (straight.wav, right.wav, and left.wav) to the Resources group within the Xcode project. Choose to copy the files into the application when prompted, as shown in Figure 21.6.

Figure 21.6 Add the necessary sound resources to the project.

Updating the CupertinoViewController.h Interface File

Now that the necessary files are added to the project, we need to update the CupertinoViewController interface file. Add an #import directive to import the AudioToolbox interface file, and then declare instance variables for three SystemSoundIDs (soundStraight, soundLeft, and soundRight) and an integer lastSound to hold the last sound we played. Remember that these aren't objects, so there's no need to declare the variables as pointers to objects, add properties for them, or release them!

The updated CupertinoViewController.h file should resemble Listing 21.3.

Listing 21.3.

#import <UIKit/UIKit.h>
#import <CoreLocation/CoreLocation.h>
#import <AudioToolbox/AudioToolbox.h>

@interface CupertinoViewController : UIViewController
<CLLocationManagerDelegate> {

    CLLocationManager *locMan;
    CLLocation *recentLocation;
    IBOutlet UILabel *distanceLabel;
    IBOutlet UIView  *distanceView;
    IBOutlet UIView  *waitView;
    IBOutlet UIImageView *directionArrow;
    SystemSoundID soundStraight;
    SystemSoundID soundRight;
    SystemSoundID soundLeft;
    int lastSound;
}

@property (assign, nonatomic) CLLocationManager *locMan;
@property (retain, nonatomic) CLLocation *recentLocation;
@property (retain, nonatomic) UILabel *distanceLabel;
@property (retain, nonatomic) UIView *distanceView;
@property (retain, nonatomic) UIView *waitView;
@property (retain, nonatomic) UIImageView *directionArrow;

- (double)headingToLocation:(CLLocationCoordinate2D)desired
                    current:(CLLocationCoordinate2D)current;

@end



Adding Sound Constants

To help keep track of which sound we last played, we declared the lastSound instance variable. Our intention is to use this to hold an integer representing each of our three possible sounds. Rather than remembering that 1 = straight, 2 = right, and 3 = left, let's add some constants to the CupertinoViewController.m implementation file to keep these straight.

Insert these three lines following the existing constants we defined for the project:

#define straight 1
#define right 2
#define left 3

With the setup out of the way, we're ready to implement the code to generate the audio directions for the application.

Implementing the Cupertino Audio Directions

To add sound playback to the Cupertino application, we need to modify two of our existing CupertinoViewController methods. The viewDidLoad method will give us a good place to load all three of our sound files and set the soundStraight, soundRight, soundLeft references appropriately. We'll also use it to initialize the lastSound variable to 0, which doesn't match any of our sound constants. This ensures that whatever the first sound is, it will play.

Edit CupertinoViewController.m and update viewDidLoad to match Listing 21.4.

Listing 21.4.

- (void)viewDidLoad {
    [super viewDidLoad];

    NSString *soundFile;

    soundFile = [[NSBundle mainBundle]
                  pathForResource:@"straight" ofType:@"wav"];
    AudioServicesCreateSystemSoundID((CFURLRef)
                [NSURL fileURLWithPath:soundFile], &soundStraight);

    soundFile = [[NSBundle mainBundle]
                  pathForResource:@"right" ofType:@"wav"];
    AudioServicesCreateSystemSoundID((CFURLRef)
                [NSURL fileURLWithPath:soundFile], &soundRight);

    soundFile = [[NSBundle mainBundle]
                  pathForResource:@"left" ofType:@"wav"];
    AudioServicesCreateSystemSoundID((CFURLRef)
                [NSURL fileURLWithPath:soundFile], &soundLeft);

    lastSound = 0;

    locMan = [[CLLocationManager alloc] init];
    locMan.delegate = self;
    locMan.desiredAccuracy = kCLLocationAccuracyThreeKilometers;
    locMan.distanceFilter = 1609; // a mile
    [locMan startUpdatingLocation];
    if ([CLLocationManager headingAvailable]) {
        locMan.headingFilter = 15;
        [locMan startUpdatingHeading];
    }
}

The final logic that we need to implement is to play each sound when there is a heading update. The CupertinoViewController.m method that implements this is locationManager:didUpdateHeading:. Each time the arrow graphic is updated in this method, we'll play the corresponding sound with the AudioServicesPlaySystemSound function. Before we do that, however, we'll check to make sure it isn't the same sound as lastSound; this will help prevent a Max Headroom stuttering effect as one sound file is played repeatedly over top of itself. If lastSound doesn't match the current sound, we'll play it and then update lastSound with the new value.

Edit the locationManager:didUpdateHeading method as described. Your final result should look similar to Listing 21.5.

Listing 21.5.

- (void)locationManager:(CLLocationManager *)manager
       didUpdateHeading:(CLHeading *)newHeading {

    if (self.recentLocation != nil && newHeading.headingAccuracy >= 0) {
        CLLocation *cupertino = [[[CLLocation alloc]
                                  initWithLatitude:kCupertinoLatitude
                                  longitude:kCupertinoLongitude] autorelease];
        double course = [self headingToLocation:cupertino.coordinate
                                        current:self.recentLocation.coordinate];
        double delta = newHeading.trueHeading - course;
        if (fabs(delta) <= 10) {
            directionArrow.image = [UIImage imageNamed:@"up_arrow.png"];
            if (lastSound != straight) AudioServicesPlaySystemSound(soundStraight);
            lastSound = straight;
        } else {
            if (delta > 180) {
                directionArrow.image = [UIImage imageNamed:@"right_arrow.png"];
                if (lastSound != right) AudioServicesPlaySystemSound(soundRight);
                lastSound = right;
            } else if (delta > 0) {
                directionArrow.image = [UIImage imageNamed:@"left_arrow.png"];
                if (lastSound != left) AudioServicesPlaySystemSound(soundLeft);
                lastSound = left;
            } else if (delta > -180) {
                directionArrow.image = [UIImage imageNamed:@"right_arrow.png"];
                if (lastSound != right) AudioServicesPlaySystemSound(soundRight);
                lastSound = right;
            } else {
                directionArrow.image = [UIImage imageNamed:@"left_arrow.png"];
                if (lastSound != left) AudioServicesPlaySystemSound(soundLeft);
                lastSound = left;
            }
        }
        directionArrow.hidden = NO;
    } else {
        directionArrow.hidden = YES;
    }
}


The application is now ready for testing. Use Build and Run to install the updated Cupertino application on your iPhone, and then try moving around. As you move, it will speak "Right," "Left," and "Straight" to correspond to the onscreen arrows. Try exiting the application and see what happens. Surprise! It won't work! That's because we haven't yet updated the project's plist file to contain the Required Background Modes (UIBackgroundModes) key.

Adding the Background Modes Key

Our application performs two tasks that should remain active when in a background state. First, it tracks our location. Second, it plays audio to give us a general heading. We need to add both audio and location background mode designations to the application for it to work properly. Update the Cupertino project plist by following these steps:

  1. Click to open the project's plist file in the resources group (Cupertino-Info.plist).
  2. Add an additional key to the property list, selecting Required Background Modes (UIBackgroundModes) from the Key pop-up menu.
  3. Expand the key and add two values within it: App Plays Audio (Audio) and App Registers for Location Updates (Location), as shown in Figure 21.7. Both values will be selectable from the pop-up menu in the Value field.
    Figure 21.7 Add the background modes that are required by your application.

  4. Save the changes to the plist file.
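If you open Cupertino-Info.plist in a text editor rather than the plist editor, the entry the steps above create should look roughly like this (the friendly names in the editor correspond to the lowercase values audio and location in the raw XML):

```xml
<key>UIBackgroundModes</key>
<array>
    <string>audio</string>
    <string>location</string>
</array>
```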

After updating the plist, install the updated application on your iPhone and try again. This time, when you exit the application, it will continue to run! As you move around, you'll hear spoken directions as Cupertino continues to track your position behind the scenes.
