One of the huge benefits of having moved to San Francisco last fall is that I get to attend all the neat things hosted here. This week was the Macworld Conference and Expo 2011 at Moscone West, presented by IDG. While it seemed like a smaller vendor showing than I’ve seen or heard of in the past, there were still plenty of great things to see. OmniGroup had a huge booth where they hosted Q&A and product sessions; HP, SMART, Fujitsu, Nuance, HyperJuice and others also had a big showing, but Canon, Adobe and others were no-shows. On the upside, Macworld (the magazine) had many of their editors and staff on stage at Macworld Live to host a few panels, including a great one on the ‘Future of the Mac’ with John Gruber. I’ve posted the photos I took to Flickr and added them to the event pool.
Tomorrow, the wireless arm of Verizon will announce at a press conference in New York City that it will begin selling a CDMA version of Apple’s iPhone 4. This ends the forty-two months of US exclusivity the device has had with AT&T Mobility. It means Verizon can stem the defection of customers leaving for AT&T to get an iPhone, will see higher ARPU and data adoption, and will cost AT&T in subscriber growth and post-paid customer churn. But that’s not what makes this announcement interesting. iPhone is important for many reasons, but it is unique insofar as it has revealed quite a bit about the carrier’s relationship with subscribers and handset makers that was never a point of focus when the market was filled with dime-a-dozen flip phones and clunky email devices.
It’s rumored that when Apple first set out to partner with a US mobile carrier for its first foray into making a phone, it went to Verizon Wireless. This partnership was, of course, not to be — but why? It’s a story of control. Both Apple and Verizon are notorious for their need for control. Verizon has never seen itself as a dumb pipe or merely a service provider. Nearly every device released for use on its network has been branded and customized through and through to offer an experience that puts Verizon’s services front and center (e.g., VCAST media stores, branded LBS products, mobile web), often at the expense of functionality provided by the ODM (original design manufacturer). Even basic messaging products were rebranded to fall in line with the TXT/PIX/FLIX unified branding strategy. Smartphones were no different, and even presently sold handsets have undergone the same treatment: devices include the Verizon suite of “value-added” products that lead back into its walled content garden. iPhone, in all its iterations with AT&T and across the globe, is not this way. Software update schedules and functionality are managed by Apple, and carrier interference ends at the network backend for things like Visual Voicemail.
But it’s unfair to single out Verizon for how it managed its phone lineup and device software. AT&T is in many ways the same with its Media Mall/AppCenter and similarly branded services; Sprint and T-Mobile too. On iPhone, though, this wasn’t the case. Even when AT&T brought some of its services to iOS, like its co-branded TeleNav offering and Family Map, the apps weren’t permitted to be preloaded on the device; they had to be downloaded from the App Store.
The phone is pristine from the consumer perspective, as though it managed to make it out of the wild without any battle scars from the carriers. iPhone bears no carrier markings to mar the clean design, and the software is all Apple. It is this that defines the experience smartphone buyers allude to when they express interest in iPhone by brand rather than fully articulating what about it is better. It has been doubly vexing for competing carriers to counter subscribers’ demands to know why it isn’t available on their network, because Apple has marketed the device in a way that is carrier independent. Apple promoted an experience customers could expect but didn’t tie it to its launch partner. It’s a phone that happened to be on AT&T, but it wasn’t an AT&T iPhone. Most people thought of it like an iPod in that sense. CDMA, GSM, EV-DO Rev. A and HSPA+ are not things the average consumer worries about, and they shouldn’t have to.
In that sense, iPhone was a revelation about how Americans bought phones and how the carrier model was subject to change. Before this device, customers didn’t take the time, or ever need, to know that the BlackBerry 8310 was an AT&T exclusive with GPS, that the 8320 had WiFi on T-Mobile, or that Verizon had the 8330 with an unlocked location radio. They just walked into their carrier’s retail location when their two-year contract was up, saw a BlackBerry there, and were satisfied. It was around the time of the iPhone debut that phone exclusivity became a big deal in the US. Verizon scrambled to find something that would appease customers; this ended up being the colossal dud known as the BlackBerry Storm and Storm2, both US exclusives for Verizon. AT&T holds an exclusive on the BlackBerry Torch for the time being.
And then there’s Android. While this deserves an article in and of itself, it’s necessary to briefly touch upon it. After the failure of the BlackBerry Storm to satisfy customers wanting a touchscreen device on Verizon, the carrier bet the farm on Android, Google’s iOS competitor first commercially released in the G1 on T-Mobile. In October of 2009, Verizon, Motorola and Google announced the Droid A855, the first successful Android-based handset to fall into consumers’ pockets. It was fully featured for the time, had strong branding licensed from Lucasfilm, and a healthy advertising budget. Droid from Verizon was truly the first device that was at least 70% as good as iPhone, and for most people that was enough. But the interesting data point here, relevant to the carrier discussion, is how Droid for Verizon was introduced and marketed.
DROID. When you read that, you knew it meant Android, but you thought Verizon. And that’s how the story gets interesting. On marketing material and ads, you see the device, but it’s presented as though it were a Verizon product. It’s not the Motorola A855 Droid. It’s Droid on Verizon (from Motorola). And it was smart, because the Droid branding wasn’t for just one device; it was for an entire class of phone that evolved with a new model every three months. Android as an OS and concept became popularized as Droid even when it wasn’t on Verizon. Better yet, it worked: if you were to poll a random sampling of the public, a significant percentage would tell you that Android is a Verizon product. It’s epitomized by the launch event for the original Droid device, in which the corporate logos were arranged, from left to right: Verizon, Motorola, Google.
You can’t count on that happening this time around. iPhone is not a Verizon property; it’s an Apple device that will run on Verizon’s voice and 3G data network. It will not be an LTE-capable device. It will not carry a Verizon logo, a checkmark, or a red and black splash screen before boot. Verizon knows this, and it’s why the company has spent so much on marketing to position itself as the market leader in reliability and overall coverage: the differentiation will not be at the device level. Even when it was taking on AT&T and its most popular device, Verizon never sought to convey that the iPhone was a bad product (save for the ill-fated iDont ad); the message was that it was on a sub-par and over-saturated network. That will change tomorrow.
And things just got a whole lot more interesting for Android ODMs that were relying on people who won’t or can’t leave Verizon, who now can get the device they actually wanted.
Keeping your data organized is hard. Keeping your data organized across many devices is harder.
Today, I’m going to show you how to keep your calendar, contacts and email in sync across devices using Google as the glue. As a bonus, I’ll even tell you how to have your files and notes everywhere too. Be advised that some of the puzzle pieces here are subscription based and cost money — things that work well usually do.
My data is stored across three applications on OS X: Mail, iCal and Address Book. They’re simple, fast and to the point. Email is painless to access across devices because of IMAP, and it’s quick thanks to IDLE, but calendar and contacts get tricky. Address Book is unique insofar as it has built-in support for Google syncing out of the box. It does, however, have a knack for forgetting to continue syncing after a while. iCal is the big one for most people, and there are ways to rig syncing for free using CalDAV and tools like Calaboration. It quickly becomes a mess and is beyond most people; be sure you have backups of your data before trying.
With your Gmail or Google Apps email account, you’ve already got Google Calendar and Google Contacts. With these, you’ll be able to access contacts and calendar using any web browser, as well as push syncing on Android smartphones and iPhone.
Getting your data into Google’s cloud services is simple with the help of Spanning Sync. It installs on your Mac(s) as a preference pane that coordinates hourly syncs to Google of your iCal and Address Book data. Pricing is fair at $25/year or $65 for a lifetime subscription. Their pricing model is per Google account rather than per device which is a plus if you have more than one Mac.
Prior to starting your first sync with this, it’s always a good idea to back up your data. You can do this in both iCal and Address Book easily by going to File > Export > Archive in each program. Syncing can get messy, and making a backup takes a few seconds and saves hours of headaches.
Go ahead and start your first sync; it’ll take a while depending on how much data you have. Upon completion, explore iCal and Google Calendar and make sure events are linked to the correct calendars and that any recurring events are as they should be. For your contacts, compare the number of contacts in Google Contacts to the count in Address Book’s status bar. Everything matches up? Great. We’re halfway done.
With respect to mobile devices, I have two to worry about: an iPhone 4 and a Droid Incredible (Android 2.2). Since we’re in the Google ecosystem now, the Android part of the equation is simple. Log into the Gmail or Google Apps account and enable syncing for all three categories (Calendar, Contacts and Mail). Your data is set once you see the sync icon disappear from the notification bar.
Now onto iPhone. Most people are syncing their Gmail and Google Apps accounts with iPhone incorrectly, perhaps because they don’t know there is a better way. Google has licensed Exchange ActiveSync from Microsoft and uses it, as part of Google Sync, as a simple way to get comprehensive PIM syncing working on iOS. What you’ll need to do is set up a new mail account in iOS 4: select “Microsoft Exchange” as the type, use your Gmail username or full Google Apps address as the username, leave the domain blank, and use “m.google.com” as the server address. Ignore any certificate warnings that pop up and make sure SSL is enabled. Give it a few minutes and your data will be synced to iPhone as well. One benefit you’ll notice about this method of accessing Gmail, other than that it now syncs your contacts and calendar too, is that mail is delivered instantly via push. Nifty.
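If it helps to see it all in one place, here’s the account boiled down into a quick sketch. The email address is a placeholder; the server, blank domain and SSL values are just the settings described above restated:

```python
# Summary of the iOS "Microsoft Exchange" account fields used for Google Sync.
# "you@gmail.com" is a placeholder; use your Gmail or full Google Apps address.
google_sync_account = {
    "account_type": "Microsoft Exchange",
    "email": "you@gmail.com",
    "domain": "",                  # leave blank
    "username": "you@gmail.com",   # same as the email address
    "server": "m.google.com",
    "use_ssl": True,               # keep SSL on, even if a certificate warning pops up
}
```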
Here are some other steps you may or may not need to worry about. If you have multiple iCal/Google Calendar calendars, you’ll need to manually turn on syncing for them on iPhone, since by default only your primary Google Calendar is exposed via Exchange ActiveSync. Go to m.google.com/sync on your iPhone, tap “Calendars”, and add a checkmark to those you want on your device. You’ll also want to disable syncing of contacts and calendars in iTunes, since that’s now obviated by over-the-air syncing. Lastly, for Google Apps users, Google Sync needs to be enabled for this to work. You can do this by logging into your “Manage this domain” control panel, clicking on “Service settings” and then “Mobile”. Check the box next to “Enable Google Sync” and you’re set.
One more thing about iPhone and Google via Exchange. When you add new contacts on iPhone, by default it will save them only to the phone’s local address book. Go to “Settings”, then “Mail, Contacts, Calendars”, scroll down to “Default Account” and change it to your Exchange ActiveSync account.
That’s it. Your contacts, calendar and mail are all syncing over the air, accessible across your Macs, on the web, and on your mobile devices all the time. Changes on one are reflected on the others, and you don’t have to worry about plugging in your iPhone for anything other than adding music and updating podcasts.
I’ll cover text capture and file access later this week, but it’s easy: Simplenote for web and iPhone, Notational Velocity for Mac and Andronoter on Android. Don’t touch Evernote with a ten-foot clown pole.
Ten years ago, if someone had told you that in the near future, your shoes would talk to your mobile phone as you run and that your phone would connect wirelessly with a pair of stereo headphones for music, all the while allowing you to play Scrabble with a friend who lives three thousand miles away in New York — you would have probably told them to leave some cocaine for the rest of us.
What’s even more shocking is that it all works together flawlessly, links to an online service to share and compete with others, and is completely reasonable in price. It may not be a personal jetpack or teleportation, but sometimes you have to take what you can get. With respect to Apple, it’s a testament to the company’s restraint in adopting and refining new and existing technologies, turning complex ideas into simple and compelling features for the end user. It’s not about being first to market or designing a device that wins on a spec sheet alone; it’s about offering features that translate into practical usability.
Palm holds a soft spot in every gadget geek’s heart. I’ve been a fan since my first Palm device, the Palm III — perhaps the first device that showed me where things were headed and the potential of mobile computing. It was followed by a Sony Clie (a Palm licensee), a Treo 600 and a few others. But Palm faced a problem: the operating system was stagnating. While Windows Mobile, albeit less user friendly, was innovating at a solid clip and powering some of the most compelling mobile devices of the mid-2000s, Palm was finding any way it could to deliver rehashes of the same device running Palm OS 5. That version, which powered the Treo 600, 650, 700p, 755p and the Centro (a device that up until last year was still sold by the major US carriers), was introduced in 2002.
The company had many challenges ahead of it: BlackBerry hit the big time, securing not only the enterprise and SMB markets but also finding success in the consumer space with the Curve and Pearl models, while Apple released the iPhone in 2007, which spelled doom for last-generation touch devices like the Treo.
In January 2009, Palm announced its answer to the competition and placed its future in the hands of its new webOS software. Early adopters and investors seemed impressed, but unfortunately it failed. Palm was acquired by HP (HPQ) on April 28th, 2010. The failure is threefold: business and marketing, software, and lastly hardware. In this post, I’ll cover the business aspects of the situation.
Palm chose Sprint Nextel as the launch partner for the device in 2009. The exclusive launch partner. It made sense, somewhat. Sprint was still hemorrhaging customers because of its divided attention after the Nextel acquisition, lackluster customer service, and a general deficit of focus. Palm expected that it could split marketing costs with Sprint, receive more prominent placement in stores, and have the Palm Pre, the first webOS device, serve as the flagship for Sprint’s rebranding campaign. Palm announced the device five months before it would actually ship, so the hype machine was in full force and many were hopeful for Palm’s resurgence. Palm stock was trading at $1.42 on the NASDAQ as of December 8th, 2008; by the device’s launch in early June, it was at $13.01.
The single-carrier exclusivity strategy was ill-fated. It limited the Pre’s audience to too few consumers. In the quarter leading up to the release of the device, Sprint reported that it had lost 1,250,000 postpaid customers, 531,000 of them from the Sprint CDMA side of the business. That meant that at launch time, Palm had a possible embedded sales base of just 25.3 million customers, because even though Sprint had 49.3 million subscribers, only a little over half were on post-paid CDMA contracts. Sprint led the industry in the worst way possible with 2.25% postpaid subscriber churn. Sprint bet, and most likely made concessions to secure the exclusivity, that the Pre would draw in and keep high-ARPU smartphone users.
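For anyone who wants the arithmetic spelled out, here’s a quick back-of-the-envelope check using nothing beyond the figures above:

```python
# Sprint's reported figures around the Pre launch (see above).
total_subscribers = 49.3e6   # all Sprint subscribers
postpaid_cdma = 25.3e6       # post-paid CDMA contracts, i.e. people who could actually buy a Pre

share = postpaid_cdma / total_subscribers
print(f"Addressable base: {share:.1%} of Sprint's subscribers")  # roughly 51%, "a little over half"
```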
Yet Palm, even riding the wave of pre-CES hype in 2009, didn’t have a magical, lust-worthy device like the iPhone that would lead customers to ditch their old wireless providers in droves. Apple didn’t have that problem; even though it was new to the phone business, it picked a huge carrier and had a built-in audience that would pay an early termination fee just because Steve Jobs blinked in their general direction. Palm had no such luxury, and it blew it. It wasn’t until the following CES in January 2010 that Palm announced the device would land on its second US carrier, Verizon Wireless, which would prove to be too little, too late, as the momentum in the smartphone space had shifted to Android and BlackBerry on other carriers and to iPhone on AT&T.
When Apple announced the unibody MacBook Pro, I ordered one immediately to replace my aging MacBook Pro. It was a great machine except for the new screen design Apple adopted: an LED-backlit LCD seated behind a sheet of thin, highly reflective glass that made outdoor use and work in bright environments nearly impossible because of reflections.
Luckily, this week Apple decided to offer a matte “anti-glare” display option on the 15″ MacBook Pro. I purchased one Friday night from the Apple Store and am posting a few pictures here since the option seems to be mildly elusive and CTO MacBook Pros ordered from the website haven’t shipped yet. Thus far, I’m really enjoying it. Take a look below.
It looks very similar to the pre-unibody MacBook Pro. Wider bezel than the 17″.
You can find more pictures over on my Flickr photostream.
“Prove that Techcrunch did not pay @biz $10,000 to get on Twitter’s suggested friend list. They sure were not more popular than @leolaporte two weeks ago.” – Robert Scoble
Well, first: when you pull a ridiculous accusation out of your ass, I do believe the burden of proof is on you. The issue here is that Robert Scoble has become increasingly irrelevant as the whole free-money Web 2.0 “social networks as conversations” thing began to wane, and he can’t handle that. For some inexplicable reason, there are a small few who actually do listen to him, and we end up with gems like this.
Twitter has a feature called “Suggested Users” so people can find new, interesting people to follow. It’s not really based on any true metric or algorithm as far as I can see, and thus it’s not a popularity contest in the way that Scoble would like things to be. So he feels the need to accuse Michael Arrington of ‘bribing’ a Twitter founder to be on that list. Other people on the list are Veronica Belmont, Felicia Day, Kevin Rose and other assorted popular users. The problem here is that Scoble and others feel that every social network has to be open and transparent so that “thought leaders” (and I mean that as a pejorative) can quantify their excellence through popularity and notoriety.
Robert Scoble is angry and whining because he’s not on it. That’s all. He’s angry that nobody actually cares about what he has to say except desperate start-up owners looking for him to shill their product. Scoble is, and always will be, an attention-grubbing blogger who, much like Dave Winer, has completely lost sight of the revolutionary environment they were thought to be fostering in the early days of Web 2.0.
The quote link leads to a thread on FriendFeed which is an interesting read for the sheer ridiculousness of it all.
I can’t believe that Congress and the new administration actually passed an extension for the US digital TV transition deadline, but alas, here we are. Look, I’ve solved the entire problem in two minutes. After the deadline, display this test card on the analog channels for one week. When old people find out their TV no longer works, they’ll call somebody young and useful to the world and learn they need a converter box or a new television in order to keep watching Wheel of Fortune and Matlock.
You’re welcome, federal government.
Four days ago, Microsoft unveiled the pre-beta of Windows 7 at PDC and offered up quite a few user interface changes meant to streamline the aging operating system. What they came up with was a taskbar that mimics the styling of KDE on Linux and further extends the broken window preview concept introduced in Vista. Needless transparency is at every corner, another UI metaphor taken to the extreme since the introduction of Vista; and of course, more ideas from OS X have made their way into Windows, although implemented less intuitively.
I want to take a bit of time to really nail down the problems that Windows has with usability and UI design that seem to never be addressed, or just seem to get worse with each revision. This is not meant to be the usual Windows v. Mac argument that happens so often — rather, it’s a summation of the fundamental interface issues that plague Windows and prevent it from being a truly usable operating system.
One thing that OS X, and iPhone in particular, have demonstrated is a full understanding of the spatial relationships that must exist in computing. While the animations and visual effects in Mac OS make for a great in-store demo, they serve a greater purpose – they’re visual cues that show where windows emerge from and move away to, and they establish relationships between the windows themselves. Perhaps the quintessential example of this is Exposé. When using Exposé, you can easily view the desktop, all application windows, or just the windows belonging to the foremost application. It’s a useful feature that is implemented perfectly. When you invoke the ‘view desktop’ key, all windows visually slide to the corners of the screen and the corners dim to reflect the temporary view.
Viewing all windows or a single application’s windows dims the background, bringing focus to the windows you called upon. Each window slides into view so you know where it came from and where each will return once you’ve completed the interaction. Exposé takes a difficult UI design issue and offers an elegant and simple solution that works better than in any other OS I’ve seen to date. Minimizing and maximizing windows to and from the dock illustrate the same concept of spatial relationships and managing lots of individual windows in a graceful manner.
In the same vein, Windows suffers from one key UI design flaw – it is incapable of hiding applications. Windows offers no way to simply “hide” an application and its windows, nor does it offer a simple way to tuck away all of an application’s windows at once. This is crucial to being able to use more than a handful of applications at once while maintaining an uncluttered workflow. For example, say I’m using three Microsoft Office programs, Firefox, iTunes and Skype. In this scenario, each application has two windows open, so we have twelve windows in total. I want all of these applications open, but not all of them are relevant to the task at hand, so I’d generally have to minimize everything in Windows and rely on Alt+Tab to let me work. The taskbar would be full of individual windows squished together, and navigating around is just plain cumbersome. Vista made this slightly easier by adding window previews to the application switcher, but the UI problem remains. Mac OS and other desktop environments solved this long ago by allowing one to simply hide an application and all related windows, via menu item or keyboard shortcut, such that they aren’t visible until called upon from the dock and won’t show up in Exposé. It’s a simple idea that makes using ten to fifteen applications at a time extremely easy. Without this, Windows remains particularly unwieldy when the information you need is scattered across different programs and you have five or more Explorer windows open.
Which leads us to the culmination of the problem: Windows wasn’t originally designed to multitask effectively. As it stands, Windows retains the antiquated taskbar at the bottom of the screen, which becomes nearly unusable once you amass more than six open windows. Some developers have tried to get around this by offering the option to minimize to the system tray, but that still reflects a generally poor and ill-conceived interface design. The answer is not increasing screen real estate, as many suggest – that only encourages continuing a poor design paradigm from Microsoft. Windows has never had a great way to organize and present multiple windows. When Windows 95 came out, the taskbar and Start menu were revolutionary as a way to keep different processes in check and accessible quickly, but the flaw in the ultimate utility of this was exposed once protected memory and powerful computers made multitasking possible and painless. In its current form, the number of applications one can comfortably work with at a time is rather low. Some may argue that there isn’t a need to keep programs open, but that is an idea borne of the usability limitations inherent in Windows.
And this speaks to the general problem that Microsoft faces today – they’re unwilling to innovate. Microsoft has such a large install base worldwide that breaking compatibility and instituting a more functional UI would draw ire from business customers and users who are set in their ways. Apple faced this same issue with the transition from OS 9 to OS X, and solved it in the most logical way it could: allowing users to continue to boot the older OS for legacy applications. The reason I feel this isn’t such a big problem for Microsoft is their success in the virtualization market. With Windows Server 2008, they included Hyper-V, their superb virtualization environment in which you can create virtual machines and run any x86 or x64 OS you wish. If Microsoft truly wanted to fix Windows and create a 21st-century OS, they would redesign Windows and offer virtualized Windows XP and Vista environments for older applications that haven’t been updated. This is how enterprises have dealt with interfacing with older database systems that don’t fit their current infrastructure, and it’s why Citrix is a company with yearly revenue measured in the billions of dollars. Microsoft has demonstrated that they try to keep backwards compatibility when they can, yet programs still break between revisions of Windows and there is little payoff in terms of security and usability. To put it plainly, Microsoft needs to quit ‘half-assing’ change and pull an Apple.
Continuing with the recent string of laptop-related posts, I thought I should post a few of the disassembly photos I took of the Lenovo X200. Taking the laptop apart is rather simple – just remove the screws on the bottom of the laptop (depending on what you want to remove, you only have to unscrew certain ones) and carefully pull off the lower top casing. After that, remove the keyboard (be careful of the ribbon cable for the keyboard/TrackPoint that snaps onto the motherboard). From here, you can swap out the WiFi card or just have a quick look around. For replacing the RAM or HDD/SSD, you don’t need to take apart the laptop: just remove the side drive bay cover (one screw) or the two screws that secure the RAM bay cover on the bottom. I should have a full review of the laptop up sometime this week, but I hope these photos are useful. If you’re unable to view the slideshow, check out the image set on Flickr.
“I missed you too” – on Flickr by süńdāyx
I’m disappointed with Apple. I’ve been using their machines since I was four years old and have been buying them personally for the past seven. It has been my platform of choice and I’ve never been unhappy with the hardware options available to me until now. I see a glaring hole in their portable line-up: a small prosumer notebook. This void had previously been filled by the 12″ PowerBook but has never been addressed since its discontinuation in early 2006. One might suggest the MacBook Air as its successor, but that’s not paying attention to what the 12″ PowerBook was – a small, lightweight notebook that made almost no compromises in performance and connectivity to achieve its minuscule footprint. I do not mean to suggest that there is not a spot in the marketplace for a thin and light MacBook Air; however, it’s clear that Apple is leaving money on the table from consumers like me searching for that elusive perfect computer in a perfect size.
But I have a dream. A dream where there is a speedy and capable notebook running Mac OS X that fulfills these wants and needs. All Apple needs to do is build it. I’ve taken the liberty of drawing up a spec sheet of what this computer should be. I give you, the perfect laptop…
MacBook Pro (13″) – Coming Soon from Apple
- 13″ 1440×900 LCD (LED-backlight)
- Discrete Graphics (Dual-Link DVI)
- Intel Core 2 Duo (Montevina)
- 2-4GB DDR3 1066MHz RAM
- 64-128GB Solid-State Disk
- Gigabit Ethernet Networking
- 802.11n Wireless Networking
- Integrated Sprint/AT&T WWAN
- Bluetooth 2.1 + EDR
- iSight Webcam
- Backlit Keyboard
- 9-Cell Battery*
- SDHC Reader
This would not require a feat of engineering, although I’m certain that Apple could work their usual magic and include some tremendously innovative features in this notebook. These features exist in many notebooks available today (such as the Sony VAIO SZ and ThinkPad X200/300), but those machines prove to be flawed choices as they do not run OS X and lack the polish I expect from a laptop, which is why I’m an Apple buyer in the first place. An optical drive? Who cares about an optical drive? The world’s thinnest notebook? I don’t need it. Simply put, I want a small and powerful laptop that can handle a long day of on-the-go use and be backed by the operating system I can’t live without.
* To keep with the svelte and clean design of Apple notebooks, a smaller battery can be included and the larger 9-cell high-capacity battery would be left as a CTO option.
There has been a huge amount of hype and misinformation in the solid-state drive debate as of late over whether it’s a technology that’s ready for primetime; I recently purchased one with my newest computer and want to offer some real-world tests. The drive in question is a Samsung 64GB SATA SSD (1.8″, Model No. MCCOE64G8MPP) which came along with my ThinkPad X200, surplus from the thin-and-light X300 I’m sure. It’s an SLC (single-level cell) drive, which offers faster transfers and a longer lifespan than the cheaper MLC drives coming onto the market, but I’ll delve into those differences a bit more later on. First, let’s see how the drive performs…
In some basic testing with the HDTune benchmarking utility, the Samsung drive performed admirably. With an average read speed of 67MB/s and a peak of 88MB/s, the drive offers about twice the performance of a standard 5400RPM SATA laptop hard disk. Where the drive really shines is its almost non-existent access times. In this test, the average seek time was 0.3ms, where a traditional notebook drive comes in at 15-20ms (roughly 50-65x slower). Read/write performance also doesn’t suffer from the pitfall of platter-based drives: data reads at the same speed regardless of where it physically sits on the drive.
The file read/write benchmarks told the same story as the standard read test. Using the 64MB file size, the drive offered consistent performance, peaking at about 100MB/s reading data and 90MB/s writing. Compared to the tests of the reference Seagate hard-disk drive, it was consistently more than twice as fast; the traditional drive peaked at 40MB/s (HDD benchmark charts are provided at the end of the article). Boot times are not a terribly relevant or accurate way to gauge a computer’s performance, but since gamers and nerds are always clamoring for them, I’ll include them anyway. With the SSD, the laptop booted to the Windows login screen in 34 seconds and reached the desktop with all startup items loaded in a total of 39 seconds. With the HDD, those same tasks took 46 seconds and 58 seconds respectively. Both tests used the same drive image running Windows Vista on an Intel Core 2 Duo 2.4GHz notebook.
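If you prefer the comparisons boiled down to ratios, here’s the arithmetic on the numbers above; nothing new, just the figures from my tests restated (the HDD seek time is taken as the midpoint of the 15-20ms range):

```python
# Quick ratios from the benchmark figures reported above.
ssd = {"peak_read_mbs": 100, "seek_ms": 0.3,  "boot_to_desktop_s": 39}
hdd = {"peak_read_mbs": 40,  "seek_ms": 17.5, "boot_to_desktop_s": 58}  # 17.5ms = midpoint of 15-20ms

print(f"Sequential read: {ssd['peak_read_mbs'] / hdd['peak_read_mbs']:.1f}x faster")          # 2.5x
print(f"Seek time:       {hdd['seek_ms'] / ssd['seek_ms']:.0f}x faster")                      # ~58x
print(f"Boot to desktop: {hdd['boot_to_desktop_s'] / ssd['boot_to_desktop_s']:.2f}x faster")  # ~1.49x
```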
As for the SLC vs. MLC debate referenced earlier, it’s all a matter of cost. The best-performing SSDs on the market are SLC drives. SLC drives offer better performance, lower power consumption and a longer lifespan (100,000 write/erase cycles per sector compared to 10,000 cycles on an MLC drive). MLC (multi-level cell) drives are cheaper to manufacture and are quickly becoming popular because of the lower price point. The lifespan argument loses its utility when one takes into account that the 10,000 write/erase cycles are averaged out by the drive’s own wear-leveling firmware so the same cells aren’t constantly being rewritten (and since SSDs have near-instant access times, this has no ill effect on performance). Also, the useful life of a consumer notebook computer is surely shorter than that of the drive. In either case, a solid-state disk can greatly enhance the performance and battery life of a notebook, but it does come at a hefty cost.
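To make the wear-leveling point concrete, here’s a rough lifespan estimate. The drive size and cycle count come from above; the daily write volume and write-amplification factor are assumptions I’m picking purely for illustration:

```python
# Back-of-the-envelope MLC endurance estimate, assuming ideal wear leveling.
capacity_gb = 64          # drive size (from above)
cycles_per_cell = 10_000  # MLC write/erase cycles (from above)
writes_per_day_gb = 20    # assumption: a fairly heavy 20GB of writes per day
write_amplification = 5   # assumption: each logical write costs ~5x in flash writes

total_writable_gb = capacity_gb * cycles_per_cell / write_amplification
years = total_writable_gb / writes_per_day_gb / 365
print(f"Estimated lifespan: {years:.0f} years")  # ~18 years even with these pessimistic assumptions
```

Even if you quibble with the assumed numbers, the drive outlives the laptop by a wide margin.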
Seagate HDD Benchmarks