Windows Phone 7 for Silverlight Developers: Input on the Phone

Posted on Jul 14, 2011
Categories: Windows Phone 7

SilverlightShow is honored to publish a brand new ebook by four-time Silverlight MVP András Velvárt: "Windows Phone 7 for Silverlight Developers". Below, and in a follow-up article, you will find free excerpts from the Design Considerations chapter of this ebook.

A full overview of the topics covered in the eBook is available here. The ebook is available for instant download.


About this eBook

From the author: Microsoft brought two of its most interesting development frameworks to the phone: XNA and Silverlight. Silverlight developers will feel right at home with Windows Phone 7. It is said that if you are already a Silverlight developer, you already know 90% of what is needed for developing Silverlight applications for the phone. This e-book is about the remaining 10%, written by Silverlight MVP and author of the WP7 app “SurfCube 3D Browser”: András “@vbandi” Velvárt.

Note: this e-book covers Windows Phone 7.0 development. Footnotes have been added where Mango (the upcoming update of Windows Phone 7) is different, but the Mango API is not detailed here.

 

A Windows Phone 7 device has a number of ways to receive data from the outside: the proximity sensor, light sensor, compass, the camera, the accelerometer, microphone and of course the touch screen. At the moment, your application only has direct access to the latter three - accelerometer, microphone and the touch screen. Hopefully, the other sensors will be available soon.

Note: You can access the camera by invoking the built-in camera app and asking the user to snap a picture. After confirmation, you have access to the image they took. However, with V1.0 of the WP7 SDK, you cannot access the raw camera output and use it for augmented reality, barcode recognition or other similar applications.

The accelerometer[1]

As its name implies, the accelerometer measures acceleration along three axes, as shown in Figure 13.

Figure 13 - the accelerometer axes

Along each axis, the measured acceleration is the sum of two key forces affecting your device. One such force is gravity. If you put the phone on a table, you will get values of approximately 0 for the X and Y axes, while the Z value will be around -1. If you flip the phone so that its screen points down, the Z value will be +1. By reading the accelerometer, you can essentially tell whether the phone is in a horizontal or vertical position, and which way is down. There are even car racing games where you steer your car by turning the entire phone like a steering wheel.

The other component of the acceleration is the actual force you create when moving your phone. For example, if you quickly lift the phone from the table, the acceleration of your lifting will be added to the Z value, and you can reach 2Gs for a brief period of time.

Getting the accelerometer working is fairly simple - you create an instance of the Microsoft.Devices.Sensors.Accelerometer class, subscribe to its ReadingChanged event, and get it started using the Start() method.
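
A minimal sketch of this pattern might look like the following (StatusText is a hypothetical TextBlock assumed to be defined in the page's XAML; the code is meant to live inside a page class):

Accelerometer accelerometer = new Accelerometer();
accelerometer.ReadingChanged += (s, e) =>
{
    // Readings arrive on a background thread; marshal back to the UI thread before touching controls.
    Dispatcher.BeginInvoke(() =>
        StatusText.Text = string.Format("X: {0:0.00}  Y: {1:0.00}  Z: {2:0.00}", e.X, e.Y, e.Z));
};
accelerometer.Start();
// Call accelerometer.Stop() when you no longer need the readings.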

The accelerometer itself is a bit noisy - don’t expect to get back the same value twice, even if the phone itself is perfectly still. It gives you data at a rate of about 50 Hz, so make sure you stop it when you no longer need the data, to save the battery. Doing a decent analysis of the accelerometer data can be pretty difficult (depending on your task), but luckily Microsoft has provided a helper class - AccelerometerHelper - that gives you four efficiently implemented filters for handling accelerometer data. (The AccelerometerHelper class is part of the Level Starter Kit at http://create.msdn.com/en-US/education/catalog/sample/level_starter_kit.)

The AccelerometerHelper class works similarly to the Accelerometer class in the SDK: create a new instance, subscribe to the ReadingChanged event, and activate the sensor by calling the StartAccelerometer method. You can also calibrate it by calling the Calibrate method. The big difference is in the event arguments: the AccelerometerHelperReadingEventArgs class exposes four fields (each of type Simple3DVector), as shown in the sketch after this list:

  • RawAcceleration - gets you the unfiltered, original data directly from the sensor. If you need the maximum possible speed and the minimum possible reaction time, this is the one to use.
  • LowPassFilteredAcceleration - returns data filtered with a low-pass filter, eliminating sensor noise. The output is a very smooth signal, but you can experience a fair bit of latency.
  • AverageAcceleration - collects and averages samples for half a second, resulting in a high-latency but stable signal.
  • OptimallyFilteredAcceleration - this is the ideal choice for a fast-reacting UI: most of the sensor noise is filtered out, but it has very low latency, so you can use it for directly controlling items on the screen.
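
A minimal sketch of that pattern, based on the description above (the member names follow the article's description of the Level Starter Kit sample and are not verified against the shipped source; BallTransform is a hypothetical TranslateTransform on a UI element), could look like this:

var helper = new AccelerometerHelper();
helper.ReadingChanged += (s, e) =>
{
    // Pick the filter that matches your latency / smoothness needs.
    Simple3DVector smooth = e.OptimallyFilteredAcceleration;
    Dispatcher.BeginInvoke(() =>
    {
        // Move a hypothetical on-screen element proportionally to the tilt.
        BallTransform.X = smooth.X * 200;
        BallTransform.Y = -smooth.Y * 200;
    });
};
helper.StartAccelerometer();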

The Microphone

The microphone allows the application to work with audio input from the environment. To access the microphone, you have to use some XNA libraries, even if you are working with Silverlight. Remember, an application can mix XNA and Silverlight APIs, but the display can only be driven by one or the other framework during the lifetime of the app.

In order for the XNA libraries to work, you have to simulate the XNA game loop, which is essentially the clock of an XNA application. To do this, invoke the Microsoft.Xna.Framework.FrameworkDispatcher.Update() method periodically. This can be done with code like the following:

// Pump the XNA FrameworkDispatcher roughly every 50 ms so the XNA audio APIs
// can process their internal message queue.
DispatcherTimer dt = new DispatcherTimer();
dt.Interval = TimeSpan.FromMilliseconds(50);
dt.Tick += delegate { try { FrameworkDispatcher.Update(); } catch { } };
dt.Start();

Now everything is set up to use the Microsoft.Xna.Framework.Audio.Microphone class. It is fairly straightforward: the Start and Stop methods begin and end recording, which is reflected in the State property. You can access the recorded data via the GetData method (to be called in the BufferReady event handler), which gives you a byte array. To make sense of the data, you can use the GetSampleSizeInBytes method and the SampleRate property.
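
A minimal recording sketch, assuming a hypothetical ProcessAudio method that consumes the raw 16-bit PCM samples, might look like this:

Microphone mic = Microphone.Default;
mic.BufferDuration = TimeSpan.FromMilliseconds(100);

// Allocate a buffer large enough to hold one BufferDuration worth of samples.
byte[] buffer = new byte[mic.GetSampleSizeInBytes(mic.BufferDuration)];

mic.BufferReady += (s, e) =>
{
    int bytesRead = mic.GetData(buffer);
    ProcessAudio(buffer, bytesRead, mic.SampleRate);  // hypothetical consumer of the PCM data
};

mic.Start();   // mic.State is now MicrophoneState.Started
// ... later, call mic.Stop() when recording is no longer needed.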

You can also play back the recorded audio using the SoundEffect class in the same namespace, which even allows you to change the pitch or the pan of the sound in the stereo space.
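
For example, a short playback sketch reusing the buffer and mic variables from the recording sketch above might look like this:

SoundEffect effect = new SoundEffect(buffer, mic.SampleRate, AudioChannels.Mono);
SoundEffectInstance instance = effect.CreateInstance();
instance.Pitch = 0.5f;   // raise the pitch by half an octave
instance.Pan = -1.0f;    // pan fully to the left channel
instance.Play();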

Touch Input

The touch screen is the most direct input method of all. Every Windows Phone 7 device is required to have a high resolution touch screen, capable of detecting and tracking at least four touch points simultaneously.

Touch events are translated to mouse events (MouseLeftButtonDown, MouseLeftButtonUp, MouseMove). This keeps compatibility with existing Silverlight code from desktop, and also makes it easy to handle single-touch scenarios.
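
A minimal single-touch sketch, assuming a hypothetical Canvas named TouchSurface defined in XAML, might look like this:

TouchSurface.MouseLeftButtonDown += (s, e) =>
{
    // Position of the finger, relative to the element.
    Point p = e.GetPosition(TouchSurface);
    // Start a drag, give visual feedback, etc.
};
TouchSurface.MouseMove += (s, e) => { /* finger moved while pressed */ };
TouchSurface.MouseLeftButtonUp += (s, e) => { /* finger lifted */ };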

Note: Some controls have been modified for multitouch. For example, the Button only fires the Click event when all fingers have been lifted from it. If you touch the button with a finger, put down another finger, then lift the first one, the Click event will not fire until the last finger has been lifted from the Button (if ClickMode is set to Release).


The Windows Phone 7 SDK also gives you the ability to detect manipulations (gestures). UIElement (the base for most classes that have a visual representation in Silverlight) defines the ManipulationStarted, ManipulationCompleted and ManipulationDelta events. While these events also exist on desktop Silverlight, they are much more important on the phone because most of the user interaction is performed via touch. The ManipulationXXX events can give you information on how a UIElement is dragged or rescaled (via a pinch gesture) by the user.
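
A minimal drag sketch, assuming a hypothetical element named Card whose RenderTransform is a TranslateTransform named CardTransform, might look like this:

Card.ManipulationDelta += (s, e) =>
{
    // Move the element by the amount the finger moved since the last event.
    CardTransform.X += e.DeltaManipulation.Translation.X;
    CardTransform.Y += e.DeltaManipulation.Translation.Y;
    // e.DeltaManipulation.Scale carries pinch scaling information when two fingers are used.
};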

The third way to get access to touch information is by using the Microsoft.Xna.Framework.Input.Touch.TouchPanel class. Since this class does not deal with screen output, it can be used with Silverlight apps as well. The TouchPanel.GetState() method returns a collection of TouchLocation objects. Each TouchLocation corresponds to one finger touching the screen, and has an ID (to differentiate it from other touches), a Position and a State (the latter indicates whether the TouchLocation is Pressed, Released, Moved or Invalid). Be aware, however, that the TouchLocation objects know nothing of the Silverlight elements - their Position is always relative to the entire screen.
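
A minimal polling sketch (for example, called from a timer tick) might look like this:

TouchCollection touches = TouchPanel.GetState();
foreach (TouchLocation touch in touches)
{
    if (touch.State == TouchLocationState.Pressed || touch.State == TouchLocationState.Moved)
    {
        // Position is in screen coordinates, not relative to any Silverlight element.
        Vector2 screenPosition = touch.Position;
        // Track individual fingers across calls via touch.Id.
    }
}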

There is a fourth approach to handle touch gestures. The Silverlight Toolkit for Windows Phone 7 has the GestureListener class to help with advanced gesture recognition. GestureService.GetGestureListener(DependencyObject) returns a GestureListener, which provides events for Tap, DoubleTap, Drag, Flick, Hold and Pinch gestures on the specified object.
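
For example, wiring up a few of these events on a hypothetical element named Photo might look like this (GestureListener ships with the toolkit in the Microsoft.Phone.Controls namespace; Drag and Pinch are exposed as Started/Delta/Completed event triples):

GestureListener listener = GestureService.GetGestureListener(Photo);

listener.Tap += (s, e) => { /* single tap */ };
listener.DoubleTap += (s, e) => { /* double tap, e.g. zoom to fit */ };
listener.Hold += (s, e) => { /* press-and-hold, e.g. show a context menu */ };
listener.DragDelta += (s, e) =>
{
    // e.HorizontalChange / e.VerticalChange describe the movement since the last event.
};
listener.PinchDelta += (s, e) =>
{
    // e.DistanceRatio gives the scale factor relative to the start of the pinch.
};
listener.Flick += (s, e) =>
{
    // e.HorizontalVelocity / e.VerticalVelocity give the flick speed.
};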

Note: Design considerations about touch

While the mouse pointer can be precise to the pixel, touch is a very imprecise form of input. Therefore, it is recommended that any touch-sensitive control should be at least 9x9 mm in size. The built-in controls already take this into account using the PhoneTouchTargetOverhang and PhoneTouchTargetLargeOverhang system resources.

It is also important to note that the user’s hand can easily cover the touched object (and half the screen with it). So, if you want to provide feedback on registering the touch - which you should - make sure that the feedback is large enough to be visible. You can use the Tilt Effect from the MSDN article “Tilt Effect Overview for Windows Phone” to achieve this (http://msdn.microsoft.com/en-us/library/ff941094(v=vs.92).aspx).


[1] Mango hardware can also have a gyroscope, which was not supported in Windows Phone 7. To hide the complexity of using the gyroscope, compass and accelerometer together, Microsoft has created a sophisticated algorithm that exposes a virtual Sensor API, combining the data from all of these sensors. This Sensor API works the same way on a phone that doesn’t have a gyroscope; it is just less precise and slower to respond.

