If we were all chickens, it might be easier to design interfaces for wearable devices. So said Jenny Murphy, developer programs engineer at Google, in her keynote address at the Wearables Developer Conference this morning in San Francisco.

When Google first began developing Glass, she said, the focus was on head gestures as a method of user interaction. But as the Glass prototypes were tested, it became apparent that head movements in humans do not always track their focus.

Unlike chickens. Murphy showed a video to demonstrate how chickens and other birds hold their heads perfectly still, even as their bodies move around beneath them. For humans, however, standing up and walking out of the room can produce a flurry of head movements that Glass reads as false positives.

(Related: Behind Wearables DevCon)

This highlighted the last of five principles Murphy described in her keynote. This last principle, that of designing for people, casts a spotlight on an interface factor that really hasn’t come into play for computer scientists until recently.

“You can tell a lot about the kind of being that uses Glass by looking at Glass. It reflects our anatomy as people. You definitely want to keep this in mind when writing Glassware,” said Murphy.

The other four principles Murphy discussed bolstered the idea of designing for human users rather than for mouse clicks and finger jabs, which could conceivably come from any old life form. Instead, she said, Glass and other wearables have to take into account the numerous human-style things that users encounter when they are computing in the real world, instead of at a desk or in a chair.

The first of these principles was to design for Glass. Murphy said this can manifest in the kind of interface your Glass application uses: Designing for Glass means taking into account the device’s transparent display.

“The same color can look completely different inside versus outside,” said Murphy. “You cannot rely on colors being reproducible or even distinctive due to light levels. Monochromatic designs tended to work much better because you ended up with a much more consistent experience.”
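
For developers, that advice translates into keeping card styling stark and single-color. As a rough sketch only, here is what inserting a monochromatic timeline card might look like through the Mirror API’s timeline.insert call using the Python client library; the credentials handling and the card copy below are illustrative assumptions, not code Murphy showed:

```python
# Sketch: inserting a monochromatic timeline card via the Mirror API.
# Assumes an already-authorized OAuth2 `credentials` object; the card
# text and styling choices are purely illustrative.
from googleapiclient.discovery import build

def insert_monochrome_card(credentials):
    service = build("mirror", "v1", credentials=credentials)
    card = {
        # Plain white text on the default dark/transparent background keeps
        # the card readable indoors and out, per Murphy's monochrome advice.
        "html": "<article><section>"
                "<p class='text-auto-size'>Flight AA 123 boards in 25 min</p>"
                "</section></article>",
        "notification": {"level": "DEFAULT"},
    }
    return service.timeline().insert(body=card).execute()
```

The point is less the API call than the styling choice: a single-color card with no reliance on hue survives the jump from a dim office to full sunlight.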

The next principle was “don’t get in the way.” Murphy explained that users of Glass should remain immersed in the world around them, not in Glass itself. That means restricting user interfaces to simple interactions instead of complex menus with lots of options.

“When you develop a piece of software for a phone, laptop or tablet, if the user is completely sucked into your software, you’ve done something right,” said Murphy. “But with Glass we found just the opposite was true. If the user was completely focused on the software they’re using, they’re not focused on the rest of their life. And that’s what wearable computing is about: The whole world is your canvas.”

Next on the list was to keep it relevant. This meant utilizing Glass’ built-in features, such as GPS and wireless connectivity, to present applications and their information automatically when needed. Murphy used the example of a shopping list automatically being displayed on Glass when a customer arrives at the supermarket.
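
Under the hood, that kind of behavior is essentially a geofence check: compare the wearer’s GPS fix against a known location and surface the card only when they are close enough. A minimal, self-contained sketch, in which the store coordinates, radius and notification hook are all hypothetical:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two GPS fixes, in meters."""
    r = 6371000  # Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

SUPERMARKET = (37.7793, -122.4193)   # hypothetical store location
GEOFENCE_RADIUS_M = 75               # "close enough to be in the store"

def maybe_show_shopping_list(current_lat, current_lon, shopping_list, show_card):
    """Push the shopping list to the display only when the wearer is on-site."""
    if haversine_m(current_lat, current_lon, *SUPERMARKET) <= GEOFENCE_RADIUS_M:
        show_card("Shopping list:\n" + "\n".join(shopping_list))

# Example: a fake GPS fix just outside the store entrance.
maybe_show_shopping_list(37.7794, -122.4192, ["milk", "eggs", "coffee"],
                         show_card=print)
```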

But the real core of Murphy’s talk was the importance of prototyping. She showed an image of the original prototype of Google Glass, which was constructed from a pico projector, wire hangers, some translucent folders, and binder clips. This prototype was built in an hour, and it let the team wear a computer and understand exactly what the hardware would feel like to use.

Though this initial prototype was the size of a laptop, it did show the team very quickly that they would need to rethink a lot of display conventions due to the strange nature of translucent screens.

And this is what Murphy advocated for all those in attendance: rapid prototyping. She said that the secret to designing wearable computing devices, and indeed to all skills, is to get your hands dirty and to create, then revise, iterate and improve. The faster you prototype, the sooner you learn the constraints of the problem space, she said.

“I encourage you, no matter what platform you’re building for, get code down,” said Murphy. “Build it, use it in your life. Use your prototypes at home, and at work, at the supermarket, with your friends, and you’ll learn a whole lot about what makes software for Glass, and wearables in general, really useful.”

Wearables, sleep and Big Data
Philippe Kahn, founder of Borland and more recently of wearables company Fullpower Technologies, admonished attendees in his keynote to keep their wearable devices non-invasive. His company has been focused on monitoring users’ sleep through an Internet-connected armband.

Kahn said that wearables cannot reliably measure and quantify behavior unless they are non-invasive. “We have a hard time quantifying what we do. It’s hard because it is tied to something that’s very important to wearables, which is Heisenberg’s Uncertainty Principle. The observer of the phenomenon changes the outcome,” he said.

He demonstrated this by showing images of sleep-monitoring devices used by physicians and researchers. One such system required shaving the user’s head and attaching 32 electrodes to their face and skull. Others were bulky facemasks that rose up to a half-foot off the wearer’s face.

Kahn asked whether these intrusive devices actually measured people’s sleep patterns and behaviors, or whether they in fact measured people’s reactions and attempts to sleep in a stressful and uncomfortable environment. He argued it was the latter. That said, he did not believe sleep would become an easy problem for developers to crack anytime soon, even with nonintrusive armbands for measuring its effects.

“The notion of being able to know something as complex as sleep and turn it into algorithms is something that is not going to happen,” said Kahn. “This kind of ‘I am going to come up with an algorithm that is going to describe completely this problem and is going to come up with the answer,’ revelationist approach to engineering doesn’t work well because the world where we are is based not just on how smart we are, but also on interpreting what we learn from Big Data and making it part of our algorithms in a very intelligent way. It’s an enormous feedback loop.”

And while sleep itself may not be reduced to an algorithm anytime soon, the things we can learn from sleep habits are only just now becoming apparent thanks to Big Data analysis at the wearables firms tracking the problem.

Eli Bressert, data scientist at Jawbone, showed off some of the information his team gathered from its analysis of aggregate sleep data from users of its UP bracelet. He said that significant events contributed to sleep loss in large populations, and that these effects were often regional.

How regional? The city of Baltimore collectively lost an hour of sleep the night the Ravens won the Super Bowl. Washington, D.C. lost sleep of its own on the night of President Obama’s second inauguration.

But the biggest danger to sleep, nationwide, is something we all just went through: Daylight saving time not only makes a whole hour vanish in the spring, it also leaves a lot of workers less productive the Monday after the change.
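
The kind of population-level comparison Bressert described boils down to averaging sleep duration by city for an event night and a baseline night, then looking at the difference. The sketch below is purely illustrative; the record layout and numbers are invented and are not Jawbone’s data or pipeline:

```python
from collections import defaultdict
from statistics import mean

# Each record: (city, date, hours_slept) for one anonymized user-night.
records = [
    ("Baltimore", "2013-02-02", 7.4), ("Baltimore", "2013-02-03", 6.3),
    ("Baltimore", "2013-02-02", 7.1), ("Baltimore", "2013-02-03", 6.2),
    # ... millions more rows in a real aggregate ...
]

def avg_sleep_by_city(records, date):
    """Average hours slept per city on a given night."""
    buckets = defaultdict(list)
    for city, d, hours in records:
        if d == date:
            buckets[city].append(hours)
    return {city: mean(hours) for city, hours in buckets.items()}

baseline = avg_sleep_by_city(records, "2013-02-02")   # ordinary night
event    = avg_sleep_by_city(records, "2013-02-03")   # night of the event
for city in event:
    delta = event[city] - baseline.get(city, event[city])
    print(f"{city}: {delta:+.1f} h vs. baseline")
```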

What to do with these wearables?
The Wearables Developer Conference was packed with interesting ideas and gadgets, but not everyone knew what to do with them just yet. Items like the Emotiv EPOC headset, the Boogio clip, and a host of development boards from Freescale, Intrinsyc, Sensoplex and Xensor were all demonstrated at the conference. But for attendees, the focus was on brainstorming uses for these devices rather than on stocking up for a specific use case.

That doesn’t mean there weren’t plenty of examples of how to use this equipment. uTest discussed its adventures in testing wearable and GPS-enabled devices. The crowdsourced testing company described a scenario in which testers had to drive back and forth across the Mexico-USA border to test a GPS system.

Boogio, on the other hand, envisions its clip-on wearable sensor device as a way to bring people’s feet into software. By clipping one Boogio on each shoe, users can track their movements. The company demonstrated this device by showing off a game where players dodged obstacles by stepping left or right.
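
Mechanically, a game like that amounts to mapping foot-sensor readings onto discrete game commands. A small illustrative sketch, assuming a made-up event format and threshold rather than Boogio’s actual SDK:

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical reading from one clip-on foot sensor: which shoe it is on
# and how firmly the wearer stepped to that side (normalized 0.0 - 1.0).
@dataclass
class FootEvent:
    foot: str        # "left" or "right"
    intensity: float

STEP_THRESHOLD = 0.6  # ignore small weight shifts

def to_game_move(event: FootEvent) -> Optional[str]:
    """Translate a firm side step into a lane change for the dodging game."""
    if event.intensity < STEP_THRESHOLD:
        return None
    return "dodge_left" if event.foot == "left" else "dodge_right"

# Example stream of sensor events -> game commands.
for e in [FootEvent("left", 0.8), FootEvent("right", 0.3), FootEvent("right", 0.9)]:
    move = to_game_move(e)
    if move:
        print(move)
```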