Category: Trends

iPhone, Verizon, Android, and the Carrier Model

Photo: jfingas on Flickr

Tomorrow, the wireless arm of Verizon will announce at a press conference in New York City that it will begin selling a CDMA version of Apple’s iPhone 4, ending the forty-two months of US exclusivity the device has had with AT&T Mobility. It means Verizon can stem the defection of customers who want an iPhone to AT&T, will increase its ARPU and data adoption, and will cost AT&T in subscriber growth and post-paid customer churn. But that’s not what makes this announcement interesting. iPhone is important for many reasons, but it is unique insofar as it has revealed quite a bit about the carrier’s relationship with subscribers and handset makers, a relationship that was never a point of focus when the market was filled with dime-a-dozen flip phones and clunky email devices.

It’s rumored that when Apple first set out to partner with a US mobile carrier for its first foray into making a phone, it went to Verizon Wireless. This partnership was, of course, not to be – but why? It’s a story of control. Both Apple and Verizon are notorious for their need for it. Verizon has never been content to be a dumb pipe or a mere service provider. Nearly every device released for use on its network has been branded and customized through and through to offer an experience that puts Verizon’s services front and center (e.g., VCAST media stores, branded LBS products, mobile web), often at the expense of functionality provided by the ODM (original design manufacturer). Even basic messaging products were rebranded to fall in line with the TXT/PIX/FLIX unified branding strategy. Smartphones were no different, and even presently sold handsets have undergone the same treatment, shipping with the Verizon suite of “value-added” products that lead back into its walled content garden. iPhone, in all its iterations with AT&T and across the globe, is not this way. Software update schedules and functionality are managed by Apple, and carrier involvement ends at the network backend for things like Visual Voicemail.

But it’s unfair to single out Verizon for how it managed its phone lineup and device software. AT&T is in many ways the same with its Media Mall/AppCenter and similarly branded services; so are Sprint and T-Mobile. But on iPhone, this wasn’t the case. Even when AT&T brought some of its services to iOS, like its co-branded TeleNav offering and Family Map, the apps weren’t permitted to be preloaded on the device; they had to be downloaded from the App Store.

The phone is pristine from the consumer’s perspective, as though it made it out of the wild without any battle scars from the carriers. iPhone bears no carrier markings to mar the clean design, and the software is all Apple. This is the experience smartphone buyers allude to when they ask for iPhone by brand without fully articulating what makes it better. It has been doubly vexing for competing carriers trying to counter subscribers’ demands to know why it isn’t available on their network, because Apple has marketed the device in a carrier-independent way. Apple promoted an experience customers could expect but never tied it to its launch partner. It’s a phone that happened to be on AT&T, but it wasn’t an AT&T iPhone. Most people thought of it like an iPod in that sense. CDMA, GSM, EV-DO Rev. A, and HSPA+ are not things the average consumer worries about, and they shouldn’t have to.

In that sense, iPhone was a revelation about how Americans bought phones and how the carrier model was subject to change. Before this device, customers never took the time, or needed, to know that the BlackBerry 8310 was an AT&T exclusive with GPS, that the 8320 on T-Mobile had WiFi, or that Verizon’s 8330 had an unlocked location radio. They just walked into their carrier’s retail location when their two-year contract was up, saw a BlackBerry there, and were satisfied. It was around the time of the iPhone’s debut that phone exclusivity became a big deal in the US. Verizon scrambled to find something that would appease customers; this ended up being the colossal dud known as the BlackBerry Storm and Storm2, both US exclusives for Verizon. AT&T currently holds an exclusive on the BlackBerry Torch for the time being.

And then there’s Android. While it deserves an article of its own, it’s necessary to briefly touch upon it here. After the BlackBerry Storm failed to satisfy customers wanting a touchscreen device on Verizon, the carrier bet the farm on Android, Google’s iOS competitor that had first been commercially released in the G1 on T-Mobile. In October of 2009, Verizon, Motorola, and Google announced the Droid A855, the first successful Android-based handset to fall into consumers’ pockets. It was fully featured for the time, had strong branding licensed from Lucasfilm, and enjoyed a healthy advertising budget. Droid from Verizon was truly the first device that was at least 70% as good as iPhone, and for most people that was enough. But the data point relevant to the carrier discussion is how Droid for Verizon was introduced and marketed.

DROID. When you read that, you knew it meant Android, but you thought Verizon. And that’s how the story gets interesting. In marketing material and ads, you saw the device, but it was presented as though it were a Verizon product. It’s not the Motorola A855 Droid; it’s Droid on Verizon (from Motorola). And it was smart, because the Droid branding was not for just one device; it was for an entire class of phone that evolved with a new model every three months. Android as an OS and concept became popularized as Droid even when it wasn’t on Verizon. Better yet, it worked: poll a random sampling of the public and a significant percentage will tell you that Android is a Verizon product. It’s epitomized by the launch event for the original Droid, where the corporate logos were arranged, from left to right: Verizon, Motorola, Google.

You can’t count on that happening this time around. iPhone is not a Verizon property; it’s an Apple device that will run on Verizon’s voice and 3G data network. It will not be an LTE-capable device. It will not carry a Verizon logo, a checkmark, or a red-and-black splash screen at boot. Verizon knows this, and it’s why the company has spent so much on marketing to position itself as the market leader in reliability and overall coverage: the differentiation will not happen at the device level. Even when taking on AT&T and its most popular device, Verizon never sought to convey that the iPhone was a bad product (save for the ill-fated iDont ad); the message was that it ran on a sub-par and over-saturated network. That will change tomorrow.

And things just got a whole lot more interesting for Android ODMs that were relying on people who won’t or can’t leave Verizon, who now can get the device they actually wanted.

The Sad Tale of Palm and webOS – Part 1: Business and Marketing

Palm holds a soft spot in every gadget geek’s heart. I’ve been a fan since my first Palm device, the Palm III, perhaps the first device that showed me where things were headed and the potential of mobile computing. It was followed by a Sony Clié (a Palm licensee), a Treo 600, and a few others. But Palm faced a problem: the operating system was stagnating. While Windows Mobile, albeit less user-friendly, was innovating at a solid clip and powering some of the most compelling mobile devices of the mid-2000s, Palm was finding any way it could to deliver rehashes of the same device running Palm OS 5. That version, which powered the Treo 600, 650, 700p, 755p, and the Centro (a device that up until last year was still sold by the major US carriers), was introduced in 2002.

The company had many challenges ahead of it. BlackBerry had hit the big time, securing not only the enterprise and SMB markets but also achieving success in the consumer space with the Curve and Pearl models, and Apple’s release of the iPhone in 2007 spelled doom for last-generation touch devices like the Treo.

In January 2009, Palm announced its answer to the competition and placed its future in the hands of its new webOS software. Early adopters and investors seemed impressed, but unfortunately it failed, and Palm was acquired by HP (HPQ) on April 28th, 2010. The failure is threefold: business and marketing, software, and hardware. In this post, I’ll cover the business aspects of the situation.

Palm chose Sprint Nextel as its launch partner for the device in 2009. The exclusive launch partner. It made sense, somewhat. Sprint was still hemorrhaging customers because of its divided attention after the Nextel acquisition, lackluster customer service, and deficit of focus. Palm expected that it could split marketing costs with Sprint, receive more prominent placement in stores, and have the Palm Pre, the first webOS device, serve as the flagship for Sprint’s rebranding campaign. With the device announced five months before it would actually ship, the hype machine was in full force and many were hopeful for a resurgence. Palm was trading at $1.42 on the NASDAQ as of December 8th, 2008; by the device’s launch in early June, it was at $13.01.

The single-carrier exclusivity strategy was ill-fated: it limited the Pre’s audience to too few consumers. In the quarter leading up to the device’s release, Sprint reported that it had lost 1,250,000 postpaid customers, 531,000 of them from the CDMA side of the business. That meant that at launch time, Palm had a possible embedded sales base of just 25.3 million customers; even though Sprint had 49.3 million subscribers, only a little over half were on post-paid CDMA contracts. Sprint also led the industry in the worst way possible, with 2.25% postpaid subscriber churn. Sprint bet, and most likely made concessions for the exclusivity, that the Palm Pre would draw in and keep high-ARPU smartphone users.
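To make the scale of that handicap concrete, here’s a minimal Python sketch of the arithmetic behind those figures. One assumption on my part: the 2.25% churn is treated as a monthly gross rate, the way US carriers typically report it.

```python
# Back-of-the-envelope math on the Pre's addressable base, using the
# Sprint figures cited above. Assumes the 2.25% postpaid churn is a
# monthly rate (the usual carrier reporting convention).
total_subs_m = 49.3          # total Sprint subscribers, millions
postpaid_cdma_m = 25.3       # postpaid CDMA base the Pre could address
monthly_churn = 0.0225

print(f"Addressable share of Sprint's base: {postpaid_cdma_m / total_subs_m:.0%}")  # ~51%

# Gross defections from that addressable base over one quarter
quarterly_loss_m = postpaid_cdma_m * (1 - (1 - monthly_churn) ** 3)
print(f"~{quarterly_loss_m:.2f}M postpaid CDMA customers churning per quarter")  # ~1.67M
```

In other words, roughly half of Sprint’s base could even buy a Pre, and that half was itself bleeding well over a million gross defections per quarter.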

Yet Palm, even riding the wave of pre-CES hype in 2009, didn’t have a magical, lust-worthy device like the iPhone that would lead customers to ditch their old wireless providers in droves. Apple didn’t have that problem; even though it was new to the phone business, it picked a huge carrier and had a built-in audience that would pay an early termination fee just because Steve Jobs blinked in their general direction. Palm had no such luxury, and it blew it. It wasn’t until the following CES in January 2010 that Palm announced the device would land on its second US carrier, Verizon Wireless, which proved to be too little, too late; the momentum in the smartphone space had shifted to Android and BlackBerry on other carriers and to iPhone on AT&T.

Accuweather.com had 858,348 pixels to tell me the weather and couldn't do it…

Where's the beef? I mean, weather?

This is the page I got as a result of typing in my ZIP code to get the weather report yesterday. It measures 1266 pixels wide and 678 pixels tall, a total of 858,348 pixels with which they could have told me the weather. That was, I thought, a simple request. Apparently it isn’t. I don’t run Adblock, but apparently I should. I usually go to wunderground.com or weather.com, both of which are far more tolerable, but I thought I should mention this anyway.

There comes a point when a company, in this case – Accuweather, needs to ask themselves: what the fuck are you doing? Do you have that little respect for the people who use your site?

Prove that Robert Scoble was not dropped as a baby

“Prove that Techcrunch did not pay @biz $10,000 to get on Twitter’s suggested friend list. They sure were not more popular than @leolaporte two weeks ago.” – Robert Scoble

Well, first: when you pull a ridiculous accusation out of your ass, I do believe the burden of proof is on you. The issue here is that Robert Scoble has become increasingly irrelevant as the whole free-money, Web 2.0, “social networks as conversations” thing began to wane, and he can’t handle that. For some inexplicable reason, there are a small few who actually do listen to him, and we end up with gems like this.

Twitter has a feature called “Suggested Users” so people can find new and interesting people to follow. It’s not based on any true metric or algorithm as far as I can see, and thus it’s not the popularity contest Scoble would like it to be. So he feels the need to accuse Michael Arrington of ‘bribing’ a Twitter founder to get on the list. Other people on it include Veronica Belmont, Felicia Day, Kevin Rose, and other assorted popular users. The problem is that Scoble and others feel every social network has to be open and transparent so that “thought leaders” (and I mean that as a pejorative) can quantify their excellence through popularity and notoriety.

Robert Scoble is angry and whining because he’s not on it. That’s all. He’s angry that nobody actually cares what he has to say except desperate start-up owners looking for him to shill their product. Scoble is and always will be an attention-grubbing blogger who, much like Dave Winer, has completely lost sight of the revolutionary environment he was thought to be fostering in the early days of Web 2.0.

The quote link leads to a thread on FriendFeed which is an interesting read for the sheer ridiculousness of it all.

UI Design and Microsoft Windows

Four days ago, Microsoft unveiled the pre-beta of Windows 7 at PDC and offered up quite a few user interface changes meant to streamline the aging operating system. What they came up with was a taskbar that mimics the styling of KDE on Linux and further extends the broken window-preview concept introduced in Vista. Needless transparency is at every corner, another UI metaphor taken to the extreme since Vista’s introduction; and of course, more ideas from OS X have made their way into Windows, albeit implemented less intuitively.

I want to take a bit of time to really nail down the problems that Windows has with usability and UI design that seem to never be addressed, or just seem to get worse with each revision. This is not meant to be the usual Windows v. Mac argument that happens so often — rather, it’s a summation of the fundamental interface issues that plague Windows and prevent it from being a truly usable operating system.

One thing that OS X, and iPhone in particular, have demonstrated is a full understanding of the spatial relationships that must exist in computing. While the animations and visual effects present in Mac OS make for a great in-store demo, they serve a greater purpose – they’re visual cues that show where windows emerge from and move away to, as well as establish relationships between the windows themselves. Perhaps the quintessential example of this is Exposé. When using Exposé, you can easily view the desktop, all application windows, or just the windows related to the foremost application. It’s a useful feature that is implemented perfectly. When invoking the ‘view desktop’ key, all windows visually slide to the corners of the screen and the corners dim to reflect the temporary view scenario.

Viewing all windows or a single application’s windows dims the background, bringing focus to the windows you called upon. Each window slides into view so you know where it came from and where each will return once you’ve completed the interaction. Exposé takes a difficult UI design issue and offers an elegant and simple solution that works better than in any other OS I’ve seen to date. Minimizing and maximizing windows to and from the dock illustrate the same concept of spatial relationships and managing lots of individual windows in a graceful manner.

In the same vein, Windows suffers from one key UI design flaw: it is incapable of hiding applications. Windows offers no way to simply “hide” an application and its windows, nor a simple way to minimize all of an application’s windows at once. This is crucial to using more than a handful of applications at once while maintaining an uncluttered workflow. For example, say I’m using three Microsoft Office programs, Firefox, iTunes, and Skype, and each application has two windows open, so we have twelve windows in total. I want all of these applications open, but not all of them are relevant to the task at hand, so in Windows I’d generally have to minimize everything and rely on Alt+Tab to work. The taskbar would be full of individual windows squished together, and navigating around is just plain cumbersome. Vista made this slightly easier by adding window previews to the application switcher, but the UI problem remains. Mac OS and other desktop environments solved this long ago by letting you hide an application and all of its windows, via menu item or keyboard shortcut, such that they aren’t visible until called upon from the Dock and won’t show up in Exposé. It’s a simple idea that makes using ten to fifteen applications at a time extremely easy. Without it, Windows remains particularly unwieldy when the information you need is scattered across different programs and you have five or more Explorer windows open.

Which leads us to the culmination of the problem: Windows wasn’t originally designed to multitask effectively. As it stands, Windows retains the antiquated taskbar at the bottom of the screen, which becomes nearly unusable once you amass more than six open windows. Some developers have tried to get around this by offering the option to minimize to the system tray, but that still reflects a generally poor and ill-conceived interface design. The answer is not more screen real estate, as many suggest; that only encourages continuing a poor design paradigm from Microsoft. Windows has never had a great way to organize and present multiple windows. When Windows 95 came out, the taskbar and Start menu were revolutionary as ways to keep different processes in check and quickly accessible, but the flaw in their ultimate utility was exposed once protected memory and powerful computers made multitasking possible and painless. In its current form, the threshold for how many applications one can comfortably use at a time is rather low. Some may argue there isn’t a need to keep programs open, but that is an idea borne of the usability limitations inherent in Windows.

And this speaks to the general problem Microsoft faces today: an unwillingness to innovate. Microsoft has such a large install base worldwide that breaking compatibility and instituting a more functional UI would draw ire from business customers and users set in their ways. Apple faced this same issue in the transition from OS 9 to OS X and solved it in the most logical way it could, by allowing users to continue to boot the older OS for legacy applications. The reason I feel this isn’t such a big problem for Microsoft is its success in the virtualization market. Windows Server 2008 includes Hyper-V, Microsoft’s superb virtualization environment in which you can create virtual machines and run any x86 or x64 OS you wish. If Microsoft truly wanted to fix Windows and create a 21st-century OS, it would redesign Windows and offer virtualized Windows XP and Vista environments for older applications that haven’t been updated. This is how enterprises have dealt with interfacing with older database systems that don’t fit their current infrastructure, and it’s why Citrix is a company with yearly revenue measured in the billions of dollars. Microsoft has demonstrated that it tries to keep backwards compatibility when it can, but programs still break between revisions of Windows, and there is little payoff in terms of security and usability. To put it plainly, Microsoft needs to quit ‘half-assing’ change and pull an Apple.

Apple and the Missing PowerBook Successor


“I missed you too” – on Flickr by süńdāyx

I’m disappointed with Apple. I’ve been using their machines since I was four years old and have been buying them personally for the past seven. The Mac has been my platform of choice, and I’ve never been unhappy with the hardware options available to me until now. I see a glaring hole in the portable line-up: a small prosumer notebook. This void had previously been filled by the 12″ PowerBook but has never been addressed since its discontinuation in early 2006. One might suggest the MacBook Air as its successor, but that ignores what the 12″ PowerBook was: a small, lightweight notebook that made almost no compromises in performance and connectivity to achieve its minuscule footprint. I don’t mean to suggest there isn’t a spot in the marketplace for a thin-and-light MacBook Air; however, it’s clear that Apple is leaving money on the table from consumers like myself searching for that elusive perfect computer in a perfect size.

But I have a dream. A dream where there is a speedy and capable notebook running Mac OS X that fulfills these wants and needs. All Apple needs to do is build it. I’ve taken the liberty of drawing up a spec sheet for what this computer should be. I give you, the perfect laptop…

MacBook Pro (13″) – Coming Soon from Apple

  • 13″ 1440×900 LCD (LED-backlight)
  • Discrete Graphics (Dual-Link DVI)
  • Intel Core 2 Duo (Montevina)
  • 2-4GB DDR3 1066MHz RAM
  • 64-128GB Solid-State Disk
  • Gigabit Ethernet Networking
  • 802.11n Wireless Networking
  • Integrated Sprint/AT&T WWAN
  • Bluetooth 2.1 + EDR
  • iSight Webcam
  • Backlit Keyboard
  • 9-Cell Battery*
  • SDHC Reader

This would not require a feat of engineering, although I’m certain Apple could work its usual magic and include some tremendously innovative features in this notebook. These features exist in many notebooks available today (such as the Sony VAIO SZ and ThinkPad X200/X300), but those machines prove to be flawed choices: they don’t run OS X and lack the polish I expect from a laptop, which is why I’m an Apple buyer in the first place. An optical drive? Who cares about an optical drive? The world’s thinnest notebook? I don’t need it. Simply put, I want a small and powerful laptop that can handle a long day of on-the-go use, backed by the operating system I can’t live without.

* In keeping with the svelte and clean design of Apple notebooks, a smaller battery could be included by default, with the larger 9-cell high-capacity battery left as a CTO option.

Benchmarking: Samsung 64GB Solid-State Disk

There has been a huge amount of hype and misinformation in the solid-state drive debate as of late, and over whether it’s a technology that’s ready for primetime; I recently purchased one with my newest computer and want to offer some real-world tests. The drive in question is a Samsung 64GB SATA SSD (1.8″, Model No. MCCOE64G8MPP) which came along with my ThinkPad X200, surplus from the thin-and-light X300 I’m sure. It’s an SLC (single-level cell) drive, which offers faster transfers and a longer lifespan than the cheaper MLC drives coming onto the market, but I’ll delve into those differences a bit more later on. First, let’s see how the drive performs…

In some basic testing with the HDTune benchmarking utility, the Samsung drive performed admirably. With an average read speed of 67MB/s and a peak of 88MB/s, the drive offers about twice the performance of a standard 5400RPM SATA laptop hard disk. Where the drive really shines is the almost non-existent access time. In this test, the average seek time was 0.3ms, where a traditional notebook drive comes in at 15-20ms (roughly 50-67x slower). Read performance also doesn’t suffer from a pitfall of platter-based drives: on an SSD, information reads at the same speed regardless of where the data physically sits on the drive.

The file read/write benchmarks told the same story as the standard read test. With the 64MB file size, the drive offered consistent performance, peaking at about 100MB/s reading data and 90MB/s writing. Compared to the tests of the reference Seagate hard-disk drive, it was consistently more than twice as fast; the traditional drive peaked at 40MB/s (HDD benchmark charts are provided at the end of the article). Boot times are not a terribly relevant or accurate way to gauge a computer’s performance, but since gamers/nerds are always clamoring for them, I’ll include them anyway. With the SSD, the laptop booted to the Windows login screen in 34 seconds and to the desktop with all startup items loaded in a total of 39 seconds. With the HDD, those same tasks took 46 and 58 seconds respectively. Both tests used the same drive image running Windows Vista on an Intel Core 2 Duo 2.4GHz notebook.
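For those who’d rather see the comparison computed than prose-ified, here’s a minimal Python sketch of the ratios implied by the figures above; every input is a number reported in this post.

```python
# Ratios implied by the benchmark figures reported above.
ssd_seek_ms, hdd_seek_ms = 0.3, (15, 20)          # average seek times
print(f"Seek advantage: {hdd_seek_ms[0] / ssd_seek_ms:.0f}-"
      f"{hdd_seek_ms[1] / ssd_seek_ms:.0f}x")     # -> 50-67x

ssd_read, hdd_read = 100, 40                      # peak MB/s, 64MB file test
print(f"File-read advantage: {ssd_read / hdd_read:.1f}x")   # -> 2.5x

ssd_boot, hdd_boot = 39, 58                       # seconds to loaded desktop
print(f"Boot: {hdd_boot - ssd_boot}s saved "
      f"({1 - ssd_boot / hdd_boot:.0%} faster)")  # -> 19s, ~33%
```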

As for the SLC vs. MLC debate referenced earlier, it’s all a matter of cost. The best-performing SSDs on the market are SLC drives, which offer better performance, lower power consumption, and a longer lifespan (100,000 write/erase cycles per sector compared to 10,000 cycles on an MLC drive). MLC (multi-level cell) drives are cheaper to manufacture and are quickly becoming popular because of the lower price point. The lifespan argument loses its utility when one takes into account that those 10,000 write/erase cycles are leveled out across the drive by its own firmware so the same cells aren’t constantly being rewritten (and since SSDs have near-instant access times, this wear leveling has no ill effect on performance). Besides, the useful life of a consumer notebook is surely less than that of the drive. In either case, a solid-state disk can greatly enhance the performance and battery life of a notebook, but it does come at a hefty cost.
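To put the wear-leveling point in perspective, here’s a back-of-the-envelope estimate. The 20GB/day write volume is my own assumption for a heavily used notebook, not a measured figure, and the math assumes ideal wear leveling.

```python
# Rough MLC endurance estimate under ideal wear leveling.
capacity_gb = 64
cycles_per_cell = 10_000       # MLC endurance figure cited above
daily_writes_gb = 20           # ASSUMPTION: generous daily write volume

total_write_budget_gb = capacity_gb * cycles_per_cell   # 640,000 GB
lifetime_years = total_write_budget_gb / daily_writes_gb / 365
print(f"~{lifetime_years:.0f} years before cell exhaustion")  # ~88 years
```

Even if the real-world figure lands at a fraction of that, the laptop will be in a landfill long before the flash wears out.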

Seagate HDD Benchmarks

Customer Support Failure: Microsoft Xbox (Part 1)

Last week I wrote about a better than expected experience with a company, and now it’s time to take it in a different direction. I’ve mentioned the issue I’ve been having with my Xbox 360 a couple of times on Twitter, but I’ll detail the story thus far here.

About five weeks ago, my Xbox 360 stopped obtaining an IP address from my router. I waited until the weekend to start troubleshooting. I power-cycled all the devices involved (Xbox, cable modem, AirPort Extreme) and tried again. Nothing. I then proceeded to reset the firmware of the router, plugged the Xbox 360 directly into the modem, hard-reset the modem, cloned the MAC address of a different device onto the 360, manually assigned the 360 an IP address rather than using DHCP, and more. Nothing worked. So I reluctantly called Microsoft Xbox support on Sunday (20 July) to see what could be done. I was walked through extensive troubleshooting steps by the Xbox Live representative, and even though I had already been through all of them, I ran through them once more. After an hour and a half on that call, being placed on hold repeatedly and embarking on remedies that made no sense, I was told to call back after I had contacted my ISP to ask why the issue was occurring. Rather than do this, I elected to borrow another Xbox 360 console from a friend and try that. After plugging it into my normal network configuration and tossing my hard drive onto it, everything connected just fine, which shows there is absolutely nothing wrong with my home network or internet connection.

I called back about an hour after the previous support call had ended. After more holding and futile support remedies, I was told the issue would need to be escalated to Microsoft for “investigation,” since they were hung up on the fact that I was using an Apple AirPort Extreme (oh noes, an Apple product!). They informed me that it would take about seven business days to complete this, and while that seemed like an extraordinarily long time for such a simple issue, I said I’d wait for a call.

Two weeks passed without a call, so I phoned Xbox customer support a third time on the 3rd of August. The support representative looked into my case, placed me on hold, and came back after ten minutes to tell me that a supervisor would call me in three to five business days once they had looked at the case, because it was still “pending” with Microsoft. I had other things to do that day, so rather than argue, I said fine and waited. Another week passed without a call. The week after (10th of August), I phoned once more to check on the status, and again I was told I would receive yet another fictitious phone call within two to three business days. Yet another week passed without any action.

Frustrated and, frankly, quite angry, I called customer support once more last Sunday. I was calm and polite throughout but became increasingly unnerved by the fact that I was getting the same story yet again. I asked pointedly why I should expect a different outcome from the “wait for a call” routine than what I had seen over the past month, and I was simply told that it would be taken care of in two days by a manager. I ended the call and stewed for a while – thankfully I have the loaner Xbox from a friend, so I could take out the frustration on some Call of Duty 4.

Welp, it’s Thursday now, four days after the last call, and I have (surprise, surprise) not received a callback. I’ll be calling back tomorrow and will not accept a callback or brush-off as a solution. I’m floored that such a simple issue can take this long, and that its investigation is being treated as though it involves a completely different company. I’ll post again with the outcome, which will hopefully be more positive than what I’ve experienced thus far. Astonishingly, this has not soured my opinion of the Xbox brand, and I’d still recommend it to anyone, as I truly think it is the best gaming experience. I just hope my case is an anomaly amongst the tens of thousands they work through.

The problem with Windows software developers.

The independent and smaller software developers for the Mac platform are the best in the business: they’re committed to the operating system and identify with the experience the end user has come to expect. On the Windows side of the aisle, this isn’t the case. Windows developers, as a community, seem horribly fragmented and don’t identify with a shared goal. Each brings a product to market for a gap they believe they can fill, but how an application will work and interface with the system and other software is almost always an afterthought. Consistent GUI? What’s that? Every program either looks as bland as possible or tries to reinvent the wheel, yielding a clusterfuck of a UI.

This morning, I was looking for a Quicksilver-inspired Windows application to use in my coming experiment (details to come soon – I have to check with Amnesty International to verify whether or not it falls in the realm of torture), and I found a new one that tries, in its own special way, to do the same. Dash really caught my attention, as it seems to best capture the basic nature of Quicksilver’s search, but it led me to another facet of the Windows developer problem: how they market their software. Take a look at the seven reasons they suggest I use Dash on their product page. Item two on that list (because everyone loves reading lists) is “Reduce Repetitive Stress Injury”. Seriously, go look, I’ll wait… Rather than the pithy “act without doing” tagline adopted by Alcor, the developer of Quicksilver, Dash takes it in the opposite direction and positions itself as a marvel of modern medicine. I say this in jest, but the problem is that it’s not addressing a problem they can solve or a strength of their product; it strikes me as something they pulled out of their butt to fill the empty space on the site.

And then there’s price. Dash is set at $50.00, except if you purchase now you can get the “pre-release offering price” of $19.95, saving you $30.05. Let’s skip right past how this is on par with infomercial-level “buy in the next thirty minutes and get a second free!” silliness and look at how they’re establishing value. Software doesn’t have to be inexpensive, and offering excessive discounts can diminish the perceived worth of a product. A great example of this is Transmit from Panic Software for the Mac: it’s an FTP client, but it’s billed as the FTP client. As such, Panic doesn’t discount the software to entice those who undervalue what it offers, which is a fantastic user experience and just well-designed software. Cyberduck and Fugu are free, but I saw the value in the product that made $29 palatable. What the makers of Dash are doing is, in my opinion, either mispricing their product or using used-car-salesman tactics to win over customers. Ignoring the fact that Quicksilver, a vastly superior product to this knock-off, is completely free, Dash’s ‘regular price’ pushes it into the range of what full-fledged productivity apps cost. LaunchBar, a product similar to Quicksilver for the Mac, is priced without the gimmicks at $19. Fair.

I could continue, but for the sake of brevity, I’ll end here. Windows developers and software vendors have so much to learn from the Mac developer community about creating better applications, and even more importantly, about marketing. If it could be summed up in a sentence or two, it would be this: stop selling simple consumer apps the same way Microsoft and Oracle sell enterprise CRM software. Home users don’t want to see how many different ways you can say the same thing on your “features” page; they want to see you solve a problem they have and do it elegantly.

WiFi theft and the "linksys" question.

The ubiquitous SSID has been delivering free internet access since the days of 802.11b, but many have argued over whether using it constitutes theft of service or criminal behavior. An unsecured access point, like a fresh out-of-the-box Linksys router, will share your internet connection for anywhere from 50 to 300 feet, likely extending beyond the walls of your home. Without encryption and a password, something rather easy to configure, anyone with a WiFi-enabled device can hop onto the network and use your internet connection. Most open access points are open inadvertently, the result of owners not savvy enough to see the risks of running an unencrypted wireless router; a few are open out of generosity and invite users, but let’s be realistic.

Two questions are raised here: whether it is legal to use an open wireless network without the explicit permission of the owner, and whether it is ethical to do so. The former is both simple and difficult to answer, depending on how you look at it. The development cycles of technology have far outpaced the legislative cycle; this, coupled with the ineptitude of elected officials on the issue, makes for an unclear legal standing for activities like this and file sharing. It’s not an illegal practice for one primary reason: there is no circumvention of a protection scheme or barrier to entry. When one looks for a network to connect to, an attempt to join an open access point is met with the router’s DHCP server accepting the request for an IP address. Where encryption is present and a pre-shared key is required to connect, attempts to circumvent that security scheme are most definitely illegal, as the operator of the network has done their due diligence to protect their internet service from theft. In that scenario, theft of service is occurring; in the former, not so much.

Is it ethical? Absolutely not. This is where I differ from most technology enthusiasts (read: nerdoids), insofar as I believe accessing an open home network is wrong because you are, in some capacity, depriving the operator of the bandwidth they pay for. The common analogy is to think of a WiFi access point as a lamp outside someone’s home: is it wrong to sit under the lamp to help you read? Absolutely not. But it’s a completely flawed analogy because it fails to account for usage patterns. Using an open network consumes a portion of the available bandwidth; for the analogy to work, the lamp would have to grow dimmer as each additional reader sits under it. Remember, most of the open APs you come across in a residential area are open out of ignorance of the security risks and/or because the owner doesn’t have the knowledge to enable encryption. The other argument is that anyone is entitled to use the network because it is being transmitted through the owner’s walls onto the user’s property or public property. Again, this doesn’t put the nature of the issue into perspective; and to get technical about it, the government owns and licenses the radio airwaves.

Simply put, using an open network is not illegal. It is not theft of service since a request for access is met with acceptance rather than rejection. However, it is unethical. Have I done it? Rarely, but yes. Does that make me a hypocrite? Probably, but it’s food for thought anyway.

Rather than driving around looking for open APs or leeching off a neighbor’s connection, just subscribe to mobile broadband. It’s really not that expensive…

Where Activision fails, Harmonix succeeds.

In 2006, Activision acquired game peripheral manufacturer RedOctane and, as a result, the rights to the ‘Guitar Hero’ game series. Whether this would be a productive move remained to be seen. A few months later, Harmonix, the developer of ‘Guitar Hero’, was acquired by MTV Networks. Neversoft took over development of ‘Guitar Hero III’ at Activision, and the Harmonix developers were freed to focus on exciting new projects with expanded access to music licensing through MTV and Viacom. One of these acquisitions made something revolutionary, and one led to the dilution of a once-in-a-lifetime brand; guess which.

Activision killed the ‘Guitar Hero’ brand. With the release of ‘Guitar Hero III’, Activision made it clear it was interested in expanding the reach of the franchise by any means necessary. That meant making it available on every platform possible, whether or not it made any sense. The result? Guitar Hero on a keychain, Guitar Hero for BlackBerry, Guitar Hero for the Nintendo DS, and more. By not innovating on the flagship product itself, Activision made it clear it was not committed to creating a compelling product that redefined the music space, just to market penetration. The $100,000,000 acquisition was a boon for the shareholders of ATVI but a net loss for the consumer and game lover.

Harmonix, free to explore the music game genre it had made viable, churned out ‘Rock Band’. The new title changed what the public imagined a simple rhythm game could be by incorporating drums and vocals into the usual plastic-guitar fare. Innovative hardware is not the only reason it has been so successful; Harmonix and MTV made downloadable tracks a priority, and as a result new songs are released on a weekly basis. As of the publishing of this article, 100 tracks are available for download by Xbox 360 and PlayStation 3 users, and over 10,000,000 songs have been purchased. Downloadable content was not new; Xbox Live users were familiar with map packs and additional content game publishers made available for sale, and PSN on PlayStation 3 allows for similar functionality. But ‘Rock Band’ proved without a doubt that game developers had the opportunity to reach the consumer after the point of sale with compelling content and interactivity. Harmonix operates a user forum and a comprehensive site for players to explore and extend their game experience. At every point where Activision failed to capitalize on the brand it acquired, Harmonix nailed it. This is why ‘Rock Band’ will be the title that reigns supreme: extensible, social, innovative.

The ‘Guitar Hero’ brand is certainly not dead, but it is nowhere near as powerful as it once was due to Activision’s systematic dilution. Executives at Activision have indicated that the next iteration of ‘Guitar Hero’ will most likely feature drums and a microphone, essentially making it a ‘Rock Band’ knock-off. That’s not quite ‘Guitar Hero’ anymore now, is it?

Personally, I’m waiting for the ‘Guitar Hero MMORPG for Nokia N-GAGE’ to be released…

They're doing IT backwards

Brain-teaser time: is it wiser to invest in network upgrades or to purchase traffic-shaping equipment in a futile effort to combat the inevitable?

This is the issue I can’t seem to understand. At UC Santa Barbara, there is a pool of 200Mb/s of downstream bandwidth allocated for about 7,000 students. The logical response to that statement is, of course, “WTF?”. Let’s run through a few hypotheticals. Assume a tenth of the students are watching YouTube videos or other streaming media (the average FLV stream is 300Kb/s); that alone consumes the entire bandwidth pool and leaves all other HTTP traffic to slow to a crawl. Throw in some rogue P2P file sharing and the infinite Facebook clicks, and it’s clear that it isn’t enough. The uptick in bandwidth consumption per student is not unwarranted given the shift in how individuals consume media, and it is only going to continue. Now, as a network administrator, how do you solve the issue? By spending tens of thousands of dollars (per device) on traffic-shaping equipment that prioritizes some traffic over the rest, deprioritizing ‘recreational’ activities to create the illusion of a functioning network.
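A minimal sketch of that saturation claim, using the figures above:

```python
# Does a tenth of campus streaming FLV video saturate the pool?
students = 7_000
pool_mbps = 200                  # campus downstream pool
stream_kbps = 300                # average FLV stream, as cited above

demand_mbps = students * 0.10 * stream_kbps / 1_000
print(f"{demand_mbps:.0f} Mb/s of video vs a {pool_mbps} Mb/s pool")
# -> 210 Mb/s: streaming alone oversubscribes the entire campus link.
```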

That is the case here. A common cable-equipped household has access to upwards of 10 megabits of downstream bandwidth. One can argue this is not guaranteed, but that’s beside the point. Let’s say that’s functional for a household of five average internet users; that would suggest the average user requires two megabits each. Divide that by four, since we assume most users won’t be constantly pulling data or won’t be online at the same time. By this math, a campus with 7,000 residents would need a 3.5 gigabit downstream link for acceptable performance. Compare that to 200 megabits; the current network capacity is less than one-seventeenth of what it should be. I don’t expect some type of miracle given both budget and infrastructure limitations, but an honest evaluation leads one to believe that devices from Packeteer and Procera costing upwards of $20,000 per box are not the best course of action and merely offset the inevitable.
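And the provisioning math itself, spelled out; the 10Mb/s household figure and the divide-by-four concurrency discount are the assumptions stated above, not measurements.

```python
# Per-student provisioning estimate from the household comparison.
household_mbps = 10              # typical cable downstream (assumption above)
household_users = 5
concurrency_discount = 4         # not everyone pulls data at once

per_student_mbps = household_mbps / household_users / concurrency_discount
needed_gbps = 7_000 * per_student_mbps / 1_000
print(f"Needed: {needed_gbps:.1f} Gb/s vs provisioned: 0.2 Gb/s")
# -> 3.5 Gb/s; the current 200 Mb/s pool is less than 1/17 of that.
```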

Links: Procera Networks, Packeteer PacketShaper, Traffic-shaping at Wikipedia.