Monday 17 September 2012

Chasmic Angst

The tragicomic outpourings over the iPhone 5 continue, with the entirely unsurprising record preorders being taken either as final proof of the sheep-like nature of iPhone users or, to the faithful, as confirmation of the device's unquestioned supremacy, depending on which zombie army you listen to. This comes on top of howls of pain from techie fanbois seemingly expecting everything from projectors to time travel, and hoots of derision from Android users who've been taking panoramas for a year now and rightfully take exception to claims of novelty. Perhaps saddest of all are the pundits (and I'm looking at Robert Scoble here) diluting their equity by attempting to talk up the unremarkable, although the fanbois attempting to justify both the proprietary nature and the prices of Lightning cables and adaptors may come equal bottom.

All this noise is drowning out the silent entry to a new era. We may, finally, have reached the infamous and quite possibly mythical Year of Mobile, although it would probably be safer to stick that label on 2013.

This is because it would seem that Apple has now crossed Geoffrey Moore's chasm. The fact that the device is boring, has the same Fisher Price interface as four years ago and doesn't do anything new or interesting is precisely what its new buyers like about it. Early majority users now feel comfortable splashing out on one of these devices in the misguided view that it's a stable, safe purchase.

Large numbers of normob users will have a dampening effect on genuine innovation, so the Apple fanbois had better get used to it. Of course, life would be easier for them if they hadn't spent the last few years being objectionable to all other mobile users, and if Apple stopped claiming originality where none lies.

Thursday 13 September 2012

Photoshop Fixation

Yesterday's Apple launch of a device with a slightly larger screen has led to howls of horror from graphic designers and a part of the app dev community. Oh no, they cry, not another set of screen dimensions for which we need to make a separate UI!

Sigh.

This kind of complaint has been going on forever. Before mobile it was desktop apps, and it has always existed on the web, where many designers prefer to work to some imaginary fixed size rather than make the most of the available screen. I feel like I've been fighting the same battle over and over since the 1980s. I even completely automated it for the Java ME world of smart and feature phones that existed before the iPhone and Android.
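For anyone who doubts how straightforward this is, here's a minimal sketch of the sort of thing I mean, written against a plain MIDP 2.0 Canvas; the class name and the proportions are purely illustrative rather than anything from a real product.

```java
import javax.microedition.lcdui.Canvas;
import javax.microedition.lcdui.Graphics;

// Illustrative only: lay the UI out from the real screen size at runtime
// instead of baking in 320x480 or any other "standard" resolution.
public class AdaptiveCanvas extends Canvas {
    protected void paint(Graphics g) {
        int w = getWidth();   // width of whatever device we are actually on
        int h = getHeight();  // height, queried at runtime

        g.setColor(0xFFFFFF); // clear the whole screen, whatever its size
        g.fillRect(0, 0, w, h);

        g.setColor(0x3366CC); // a header bar sized as a fraction of the screen
        g.fillRect(0, 0, w, h / 10);

        g.setColor(0x000000); // text positioned relative to the centre
        g.drawString("Hello, any screen", w / 2, h / 2,
                     Graphics.HCENTER | Graphics.BASELINE);
    }

    // Called if the display size changes (sliders, rotation and so on)
    protected void sizeChanged(int w, int h) {
        repaint();
    }
}
```

Not a line of it cares whether the screen is 176x220 or 360x640, which is rather the point.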

I can't resist saying this again, sorry. When the iOS SDK first came out I looked in vain for the API that returned the screen dimensions. Checking the sample programs, I found they all used literal hardcoded numbers, not even proper constants. I really was shocked, but apparently the powers that were had decided that 320x480 was what the masses wanted, now and forever.

And so Photoshop became the mobile UI prototyping tool of choice.

To understand why this is so wrong it helps to go back to the basics of art training. A good knowledge of anatomy - bone structure and musculature - is fundamental to being able to paint, sketch, draw or sculpt people. Artists have to understand what's going on below the skin to make their work look good. The same is true for designers: prettiness is not everything; it's got to hang on the bones and move with the muscles.

At least this time round I'm not the only person wanting people to design adaptively. The responsive design and progressive enhancement movements have understood that relative design and intelligent adaptive coding can not only absorb a few dozen pixels here and there but also create optimal user experiences across widely varying screen resolutions.

The rest of you had better take heed and get with the program: there are more and more form factors coming to market. Making your pretty face fit the head it appears on isn't that difficult. Just get on with engineering it.

Tuesday 11 September 2012

Are comments the death of debate?

Historically most people have spent their time with like-minded friends and had limited exposure to those of opposite views. Even when we do meet someone with opposing views, there are body language cues and societal rules about how to behave and talk with them. When those rules break down we have other words for what happens, like riot or lynch.

Online, things are different. While there are sites where like-minded people can pat each other on the back and support each other's points of view, there are plenty of sites that at least try to address topics where multiple views may be held. Technology news is a classic.

The internet has long been a safe haven for trolls who love to wind others up, or who simply don't have the social skills to see what they are doing. However, we seem to be moving into a time where opinion has become a matter of faith. Faith, as we all know, is not based on fact. Faith is usually undermined by fact and destroyed by reasoned debate.

In an ideal world, unfettered commenting on articles would be a great public debate, leading to shared enlightenment and harmony. Unfortunately most comment threads dissolve into a brawl. Just look at the Techcrunch comments on anything mobile, or on any posting even vaguely related to American politics. They all lead to disproportionate, unseemly outpourings of hate. Eventually that will be enough to stop many people discussing a topic at all, which may well be what the haters want: not the suppression of debate as such, although some will want that too, but simply not to have their views challenged.

Monday 10 September 2012

Mobile OSes and the UNIX Workstation Wars

Last week I wrote about tech culture having shorter and shorter memories. When an old friend reminded me of the Apollo Domain workstation, one of the first proper graphical workstations on the market back in the 1980s, I was struck by the resemblance between the current mobile market and the workstation market then.

Back then PCs were slow things that accountants used for spreadsheets and on which secretaries typed up documents in WordPerfect, not realising that their whole profession was soon to vanish. Serious-minded people with engineering degrees and beards needed the oomph of a Graphical Workstation.

Apollo's Domain was one of the last proprietary operating systems as UNIX spread. Indeed, if you go back to that seminal work on technology entrepreneurship, Gordon Bell's High-tech Ventures, securing a UNIX license is one of the steps any startup must go through.

UNIX was everywhere, except that everyone had their own "added value" version and silly name. IBM with AIX, HP with HP-UX, Sun with SunOS then Solaris, Digital with Ultrix, ICL with PNX for the short-lived Perq product. Each one vaguely similar yet irritatingly different.

Over in the world of PCs there was Windows and a tiny number of people using Macintoshes. CP/M was already dead and nobody in their right mind was making new operating systems until some bright spark came up with the idea of an open-source version of UNIX.

Strikes me that the world of mobile is very similar - each company has its own variant on a theme, all with some kind of irritating "added value". What can we learn from this? While most of those old UNIX versions still linger on in high-value data center systems, the workstation market has vanished, leaving a world of Windows, Macintosh and a smattering of Linux amongst techie people. Hardware makers went back to doing hardware and just licensing software, limiting their efforts to littering their products with useless extras. The consumer and non-specialist market prefers something it recognises.

RIM and its efforts to make something out of BB10 remind me most of Apollo and Domain. Brilliant at the time, but eventually succumbed to the lower-cost, standardised UNIX layer. Acquired and assimilated into HP in the end. 

Friday 7 September 2012

The Network is the Risk

John Gage of Sun Microsystems, now part of Oracle, famously came up with the slogan The Network is the Computer to usher in a new style of computing. He was describing the UNIX workstation paradigm: machines connected by a local network cable and using a central file server. This architecture, which is now taken for granted, ushered in a whole new range of threats and risks.

Fast forward a couple of decades, to when network became largely synonymous with Internet, and that risk became a whole lot bigger. Fast forward again to today, and both network and Internet are synonymous with wireless, ubiquitous connectivity. Each stage has widened that range of threats and risks.

It is therefore time to introduce a new aphorism:

The Network is the Risk

People serious about information security still keep discrete computers discreet in metal-lined tents. Enterprise IT departments keep trying to lock their data away in key-card protected data centers, but it keeps leaking out into mobile devices of all sorts. That network connection is now the biggest risk CIOs face.

Wednesday 5 September 2012

Shorter and shorter memories?

Just as the memory capacity of portable devices explodes, are our cultural memories getting smaller? While software and computing have continually reinvented themselves, it seems that the memory cycle is getting shorter and shorter, and narrower and narrower.

Maybe this is an aspect of the increasing complexity of technology, where nobody can know the entire stack from circuit to UX. Yet at the same time that very technology makes our history easier and easier to look up.

This came to mind reading an article about QR codes where the author clearly had done no research on the topic at all. A few quick searches and he would have learned that the idea of cameras always watching for QR codes has long existed in Japan and other markets. As it is, his piece comes over as, in the words of the social media generation, an epic fail.

Looking at the burgeoning consumer app market you can find loads of examples of people trying to reinvent concepts that failed five years ago for good reasons. Unfortunately, a lack of adequate technology was rarely one of them. This could explain the rather depressing stream of emails telling me that services I'd forgotten signing up to were closing.

But one area where we really, really should remember all the failed systems of the past is enterprise integration. Nobody has an excuse anymore, technical or business, for thinking that whatever they are building will not have to connect to other systems or to new endpoint devices. Yet this happens again and again.

So people, please check your history, plan for the future, and only then start to build.

Monday 3 September 2012

Are development tools always playing catchup?

Older people will know that the item in the picture is a Stanley Yankee pump screwdriver. You carefully slotted the bit into the screw and then pressed the handle, hard. The ratchet turned and you drove the screw into the work. It required strength and accuracy, as those things were quite long.

Of course Yankees have long been discarded in favour of their electric descendants, and rightly so. An electric screwdriver gives more power for longer, demands rather less accuracy, and often comes with torque control to boot. Power tools in general have represented a huge jump in operator performance, ranging from chainsaws to minidiggers and backhoes through to Dremels.

However the jump from Yankee to electric screwdriver is nothing like as big as the jump from a traditional screwdriver to the Yankee. And unlike the productivity jump of the first steam shovels over an army of navvies, sometimes a couple of guys with a shovel are more effective than a minidigger.

While software development tools have moved on from when we punched codes into cards and tapes, they are still largely recognisable. We have seen lots of fantastic productivity aids like autocompletion, library lookup and static checkers, on top of simply having more screen space. But have we seen huge quantum leaps like the steam shovel?

Proponents of 4GL tools would say we have, but, as with a steam shovel, you still need people round the edges to finish the job. They'll do fine if you don't mind software that is the equivalent of raw shuttered concrete; otherwise you need to pass it on to the finishing teams.

The availability of huge, free class libraries for everything from UI to analytics is probably a more significant step for the creation of attractive, advanced software. They take out much of the grunt work of the old days, when you had to write your own linked list package for each new data structure.
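To labour the point with a trivial Java sketch (the class name is mine; the library is just the standard one): what used to mean writing and debugging your own list package is now a couple of calls into java.util.

```java
import java.util.LinkedList;
import java.util.List;

// The sort of plumbing nobody should write by hand any more:
// the standard library already ships a linked list, and generics
// keep it type-safe without a separate package per data structure.
public class NoMoreHandRolledLists {
    public static void main(String[] args) {
        List<String> screenSizes = new LinkedList<String>();
        screenSizes.add("320x480");
        screenSizes.add("640x960");
        screenSizes.add("640x1136");

        for (String size : screenSizes) {
            System.out.println("Design for " + size);
        }
    }
}
```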

The real problem, however, is that the vast increase in expectations of, and complexity in, apps absorbs all the improvements in developer performance. We wouldn't need the wonderful jQuery transition functions if we were happy with content simply changing. Extra time gained by using things like comms libraries is absorbed by fiddling around with graphics.

The consumer revolution is out of the bag and we need to consider any software team a multi-disciplinary one, including designers, writers and QA as well as coders. About the only thing we can be sure of is that complexity is not going down and that expectations are going up, while we see yet more form factors and channels appearing.