Advances in mobile hardware technology, as well as the adaptation of mainstream computer software for mobile devices, have blurred the lines between laptop computers and tablets. Mobile devices, however, include capabilities never before imagined on desktop or laptop systems – the ability to detect and react to motion.

The latest iOS devices are equipped with sensors designed to identify and report motion. All fourth-generation iOS devices, for example, come equipped with a three-axis gyroscope and a three-axis accelerometer for constantly tracking movement. The motion information is made available to your iOS app through the use of motion events. As a user moves or rotates a device, the Core Motion framework processes the information and communicates it to your app through notifications.

A brief history

Mobile devices have become part of our everyday lives. The line between desktop and laptop computers is easy to understand. Old-school users remember a time when it was not feasible to run processor-intensive applications on anything other than a desktop system. Conversely, a desktop computer would never be considered for a road trip. The main advantage of a laptop computer to a student or business traveler is simple: it’s mobile. There is now a strong trend of users choosing a tablet or smartphone as a full-time replacement for their laptop computers. It’s simply part of technology’s evolution.

Until recently, tablets and laptops fell into two distinct user-functionality classifications. There was as much of a gap between laptops and tablets as there was between desktop systems and laptops. The main difference was that the performance capabilities of an early laptop could never compare to the power and expandability of a desktop system. This is certainly not the case as we compare a laptop to a tablet, or even a tablet to a smartphone. Some of the features once unique to a particular class of device are now available on all devices. Touch screen technology, for example, can be found on desktop computers, laptops, tablets, and smartphones alike.

Motion sensing technology, however, remains unique to mobile devices.

As a type of input, motion-sensing technology allows a user to become fully immersed into a gaming app. Instead of a joystick, or other tethered input device, a user rotates, tilts, and pivots the tablet or smartphone to control certain aspects of a game.

Other uses of motion detection include the commonly used orientation property available within the UIDevice class. Developers are often tasked with maintaining two orientation-specific interface layouts – one for portrait and one for landscape. In certain cases, as with Apple’s iOS Calculator app, the functionality also changes. The built-in calculator on the iPhone changes from a standard calculator in portrait mode to a scientific calculator in landscape mode (Figure A).

Figure A

Detecting changes in device orientation

Motion events can either be pushed to the app through the use of notifications, or requested as needed. In either case, your app must designate an object to handle the task – an observer for notifications, or a first responder for motion events. Tracking the orientation of an iOS device does not require the Core Motion framework. The UIDevice class is sufficient for tracking and handling general orientation changes, and it works on all devices running iOS 2.0 and above.

The seven possible values of the orientation property described in the UIDevice Class Reference are:

  • UIDeviceOrientationUnknown – The exact orientation of the device cannot be detected.
  • UIDeviceOrientationPortrait – The device is in portrait mode, with the home button below the screen.
  • UIDeviceOrientationPortraitUpsideDown – The device is in portrait mode, with the home button above the screen.
  • UIDeviceOrientationLandscapeLeft – The device is in landscape mode, with the home button on the right.
  • UIDeviceOrientationLandscapeRight – The device is in landscape mode, with the home button on the left.
  • UIDeviceOrientationFaceUp – The device is held parallel to the ground, with the screen facing up.
  • UIDeviceOrientationFaceDown – The device is held parallel to the ground, with the screen facing down.

You prepare your iOS app to handle changes in a device’s orientation by (1) telling the device to begin generating orientation notifications and (2) registering an observer for those notifications (Listing 1.1). This is typically done within the viewDidLoad method of your view controller.

Listing 1.1

- (void)viewDidLoad {
    [super viewDidLoad];
    // Enable accelerometer-based orientation notifications
    [[UIDevice currentDevice] beginGeneratingDeviceOrientationNotifications];
    // Register to be notified whenever the orientation changes
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(orientationDidChange:)
                                                 name:UIDeviceOrientationDidChangeNotification
                                               object:nil];
}
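One detail worth noting: each call to beginGeneratingDeviceOrientationNotifications should be balanced with a matching end call, and the observer should be unregistered when it is no longer needed. A minimal sketch of that teardown – placing it in viewWillDisappear: is one reasonable choice, not a requirement:

```objectivec
- (void)viewWillDisappear:(BOOL)animated {
    [super viewWillDisappear:animated];
    // Stop listening for orientation-change notifications
    [[NSNotificationCenter defaultCenter] removeObserver:self
                                                    name:UIDeviceOrientationDidChangeNotification
                                                  object:nil];
    // Balance the earlier begin... call
    [[UIDevice currentDevice] endGeneratingDeviceOrientationNotifications];
}
```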

Once initialized, every change in the device’s orientation triggers a notification, which is delivered to the method identified in the selector parameter (in this case: orientationDidChange:). The notification simply indicates that there was a change in the device’s orientation. You can retrieve the current orientation within your custom method (Listing 1.2) by referencing the orientation property: [[UIDevice currentDevice] orientation].

Listing 1.2

- (void)orientationDidChange:(NSNotification *)notification {
    // Get the current orientation
    UIDeviceOrientation orientation = [[UIDevice currentDevice] orientation];
    // Handle any device orientation changes here
}

Six of the seven possible values returned when polling the current orientation (listed above) represent an opportunity for your app to respond. You may want to respond only to changes in landscape or portrait mode. In other words, you could add code to ignore all orientations except portrait and landscape (Listing 1.3).

Listing 1.3

// If any of the unwanted orientation values are returned, exit the method
if (orientation == UIDeviceOrientationUnknown
    || orientation == UIDeviceOrientationFaceUp
    || orientation == UIDeviceOrientationFaceDown) {
    return;
}

Detecting shake-motion events

Another approach to detecting and responding to motion events involves the use of the UIEvent class. Detecting shake-motion events does not require the Core Motion framework. In fact, for any iOS app that only needs to know the status resulting from a motion event (e.g. orientation), the Core Motion framework is unnecessary overhead. A shake-motion event has been used to trigger everything from “erasing the screen” to “rolling the dice.” When a user shakes an iOS device, accelerometer information is evaluated to determine whether a legitimate shaking gesture has occurred.
To handle motion events, you need to implement at least one of the two motion-handling methods: motionBegan:withEvent: or motionEnded:withEvent:. The object that implements them must also be the first responder, or the event will never reach it. Most commonly, the motionEnded:withEvent: method is used to receive notifications at the conclusion of a motion event. To identify the event as a motion-shake event, check the UIEventSubtype value (Listing 1.4).

Listing 1.4

- (void)motionEnded:(UIEventSubtype)motion withEvent:(UIEvent *)event {
    if (motion == UIEventSubtypeMotionShake) {
        // We just detected a motion-shake event
    }
}

Final Thoughts

Motion tracking is a powerful feature of iOS. The Core Motion framework makes it easy to receive notifications of a motion-related event. It is up to your app to determine the type of event, and how to respond. The Core Motion framework also provides access to the raw accelerometer and gyroscope information. This is particularly useful for apps that require real-time information. For most uses, however, the UIDevice and UIEvent classes provide enough information to handle device orientation and shake-motion events.
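As a rough sketch of that raw access – assuming a project that links Core Motion; the update interval and the choice of the main queue here are arbitrary illustrative choices:

```objectivec
#import <CoreMotion/CoreMotion.h>

// Keep a strong reference to the manager (e.g. in a property),
// or updates will stop when it is deallocated
CMMotionManager *motionManager = [[CMMotionManager alloc] init];
motionManager.accelerometerUpdateInterval = 1.0 / 60.0; // 60 Hz

if (motionManager.isAccelerometerAvailable) {
    [motionManager startAccelerometerUpdatesToQueue:[NSOperationQueue mainQueue]
                                        withHandler:^(CMAccelerometerData *data,
                                                      NSError *error) {
        // Raw acceleration, in g's, along each axis
        NSLog(@"x: %f  y: %f  z: %f",
              data.acceleration.x,
              data.acceleration.y,
              data.acceleration.z);
    }];
}
```

Call stopAccelerometerUpdates when the data is no longer needed.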
