Silicon Valley Home Prices, Stock Prices & Bitcoin (2021)

A little less than four years ago, I wrote a post about home prices in Silicon Valley and how they relate to stock prices and Bitcoin. It was one of the most popular posts on my blog from 2017.

The original compared housing prices in Palo Alto to a few of the largest technology companies in Silicon Valley, with Bitcoin added just for fun. Given the incredible rise in technology stock prices and Bitcoin in the past few years, it seemed worthwhile to update the data in the original post.

Talking about home prices in Silicon Valley is always a sensitive topic, because the lack of affordable housing continues to be both a difficult and heavily political issue. As someone who grew up here, it seems painfully obvious that the primary problem is the overwhelming resistance of local city councils to approve housing construction that meets ever-increasing demand.

This post isn’t about that issue.

Instead, this is an attempt to look at the housing market through another lens. Most financial estimates of housing cost tend to compare the price of housing to incomes, which makes sense since for most people in most places, the affordability of a home is directly related to the size of the mortgage that they can obtain for that home. In general, houses are purchased based on income, not assets.

In Silicon Valley, of course, income looks a bit different since many people in Silicon Valley work for technology companies, and most technology companies compensate their employees with equity.

Palo Alto Home Prices

I chose Palo Alto as a proxy for Silicon Valley home prices because it is historically “ground zero” for Silicon Valley tech companies, and it has relatively close proximity to all of the massive tech giants (Apple, Google, Facebook).

The original post started the data sets in June 2012, since this was roughly when Facebook became a public company. For this post, I’ve extended the data sets all the way to March 2021.

All housing prices have been sourced from Zillow. All stock prices have been sourced from Yahoo Finance, and reflect the price adjusted for dividends. All Bitcoin prices have been sourced from Investing.com.

This is what Zillow looks like today for Palo Alto:

As you can see, in June 2012, the average Palo Alto home cost $1.44M. Roughly five years later, in June 2017, that average price was up about 77% to $2.55M. Now, in March 2021, that price has risen a total of 117.9% to $3.15M.

That’s certainly a much faster increase than any normal measure of inflation, whether looking at changes in prices or wages. But what happens if we look at those increases in comparison to the stocks of some of the largest technology employers in Silicon Valley?

Apple ($AAPL)

Apple is the most valuable public company in the world right now, measured by market capitalization ($2.023 Trillion as of March 18, 2021), and second most profitable ($55.256B in 2020). Thanks to their exceptional financial performance, Apple stock ($AAPL) has increased significantly since June 2012, rising (split-adjusted) from $18.79 per share to $124.76 in March 2021. That’s a gain of roughly 564%.

Wow. 😳

Let’s look at Palo Alto home prices as measured in dollars, and then let’s look at them priced in shares of $AAPL.

This chart tells a very different story than the one from 2017.

In the five years from June 2012 to June 2017, Apple stock was volatile, but over the entire time period almost exactly matched the growth in Palo Alto home prices. However, the run up since 2017 has been incredible.

Split-adjusted, in June 2012 it took 76,839 shares of $AAPL to purchase the average home in Palo Alto. By March of 2021, that number had dropped to only 25,216 shares.

This isn’t surprising, since Palo Alto home prices are only up 117.9% over that time period, and Apple shares are up 564%. But what this means from a practical viewpoint is that for people converting one asset (Apple stock) into another (Palo Alto housing), it has become easier, not harder, to purchase the average home.
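
For anyone who wants to reproduce these conversions, the math is just a change of unit: divide the home price by the share price. Here is a minimal Python sketch using the rounded prices quoted above (small differences from the figures in this post are just rounding in the quoted prices):

```python
# Price a Palo Alto home in shares of a stock, using prices quoted in this post.
def shares_to_buy(home_price: float, share_price: float) -> float:
    return home_price / share_price

# June 2012 vs. March 2021, split-adjusted $AAPL
print(round(shares_to_buy(1_440_000, 18.79)))   # ~76,600 shares in June 2012
print(round(shares_to_buy(3_150_000, 124.76)))  # ~25,200 shares in March 2021

# The same decade, measured in dollars vs. measured in shares
print(f"{124.76 / 18.79 - 1:.1%}")              # AAPL: ~563.9% gain
print(f"{3_150_000 / 1_440_000 - 1:.1%}")       # home: ~118.8% gain
```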

Google ($GOOGL)

Google tells a similar story to Apple in 2021, even though that wasn’t the case in the original post. Since 2017, Apple stock has clearly outperformed Google, leaving the two companies with almost identical price increases since June 2012. (By itself, that’s somewhat of an amazing fact given the relative ages of the two companies.)

As of March 2021, Google has a market capitalization of $1.37 Trillion, significantly less than Apple’s. However, they have seen price appreciation of 557.3% since June 2012, rising from a split-adjusted $316.80 per share to an amazing $2,082.22 per share in March 2021.

Let’s look at Palo Alto home prices as measured in dollars, and then let’s look at them priced in shares of $GOOGL.

If you compare this chart to the one for Apple, it tells a different story but has a similar ending. Google shares are clearly more volatile than Palo Alto housing, but they have fairly consistently appreciated over the past decade.

In June of 2012, it would have taken 4,557 shares of Google stock to purchase the average home in Palo Alto. By March 2021, that number had dropped to only 1,511 shares.

So while Palo Alto home price appreciation has been tremendous by any historical measure, Palo Alto housing has become cheaper in the past decade for people holding Google stock, and more expensive for people holding dollars.

Facebook ($FB)

Facebook, the youngest of the massive tech giants, already has one of the largest market capitalizations in the world. As of today, Facebook is valued at $793.4 Billion. Facebook stock has risen an incredible 1208.2% since June of 2012, from a price of $21.71 per share to a price of $284.01 in March 2021.

At this point, you know how this story goes. With growth of over 1200%, Facebook stock goes a lot further in 2021 than it did in 2012, even against daunting Palo Alto housing prices.

In June of 2012, it would have taken 66,500 shares of Facebook to purchase the average home in Palo Alto. By March of 2021, that number was down to just 11,077 shares. Quite incredible.

Bitcoin ($BTC)

While I realize that Bitcoin isn’t a large employer in Silicon Valley, nor is it a stock, the original idea for this post came from a joke I made on Twitter back in 2017.

Most of you likely already know the story here. Bitcoin price appreciation in the past 12 months has been unbelievably high, so looking back to June 2012 is going to be somewhat jarring.

In June of 2012, the price of Bitcoin was about $9.40. By March of 2021, it had risen to $57,326.20. That’s a gain of over 609,753%.

The growth rate in Bitcoin prices, as measured in US dollars, has been so incredible that this chart is almost impossible to read in recent years.

For context, in June of 2012, it took about 153,586.2 Bitcoin to purchase the average home in Palo Alto. By March of 2021, that number had dropped to just 54.9 Bitcoin.

This, of course, has a number of dramatic implications. As measured in US dollars, or in real assets like Palo Alto real estate, the wealth of Bitcoin holders has increased dramatically. As measured in US dollars, the average price of a house in Palo Alto has increased by 117.9% in less than 10 years. However, as measured in Bitcoin, the average price of a house in Palo Alto has decreased by 99.96%.

There aren’t many people who invested in Bitcoin back in 2012, but a disproportionate number of them were in Silicon Valley. However, even based on recent numbers, the story is similar.

In March of 2019, you could have purchased the average house in Palo Alto for 702.0 Bitcoin. Just two years later, in March 2021, the average house in Palo Alto sold for 54.9 Bitcoin. That means the average home in Palo Alto, as measured in Bitcoin, has decreased by 92.2% in just the past two years alone.
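
If you want to check these percentages yourself, here’s a quick sketch using only the BTC-denominated home prices quoted above:

```python
# Average Palo Alto home priced in Bitcoin, from the figures in this post.
home_in_btc = {
    "June 2012":  153_586.2,
    "March 2019": 702.0,
    "March 2021": 54.9,
}

def pct_change(start: float, end: float) -> float:
    return (end - start) / start

print(f"{pct_change(home_in_btc['June 2012'], home_in_btc['March 2021']):.2%}")
# -99.96%: measured in BTC, the home is ~2,800x cheaper than in 2012
print(f"{pct_change(home_in_btc['March 2019'], home_in_btc['March 2021']):.1%}")
# -92.2% in the last two years alone
```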

Silicon Valley Is Seeing Significant Asset Inflation

These charts are not meant to imply direct causality, but in many ways they confirm several economic facts about Silicon Valley that may not be obvious when looking at nationwide statistics.

Because technology employers in Silicon Valley compensate most employees with equity, it is very likely that asset inflation in stock (and crypto) markets has some impact on the housing market. This is likely exacerbated by the lack of new housing construction in Silicon Valley.

The fact is, if you are fortunate enough to have equity in one of the tech giants, or if you have been an investor in Bitcoin, houses might actually look cheaper in 2021 than they did in 2012, or even in 2020.

What is most surprising about the data refresh is the apparent detachment of equity and crypto prices from the prices of Palo Alto real estate. There are a number of potential reasons why this might have happened. One theory is that real estate markets move relatively slowly compared to equities and crypto, and so the rapid price increases of 2020 have not yet worked their way into the market. A second theory is that large technology company compensation has been shifting away from stock options to RSUs, leading employees to hold less stock as they convert their shares to cash on vesting. A third theory is that we’re seeing complicated effects from COVID, as windfall money from equity and crypto markets may be flowing into other places rather than local real estate.

(Before the San Francisco crowd gets too rowdy, there is absolutely no evidence yet that more money is flowing into San Francisco real estate instead of Palo Alto this cycle.)

In any case, whatever the reasons may be, it is always worth checking the actual data to see whether it confirms or contradicts our intuition.

Let’s check back in another four years.


Silicon Valley Home Prices, Stock Prices & Bitcoin

I’m writing this post with a bit of trepidation, because talking about Silicon Valley home prices these days is a bit dicey. The surge of the last five years has been shocking, and almost no one I know feels good about how difficult it is for people to buy a new home in Silicon Valley in 2017.

So if you need a trigger warning, this is it. Stop reading now.

The truth is, as shocking as the rise in Silicon Valley home prices has been, there has also been an asset boom in other dimensions as well. Total compensation for engineers is up considerably and stock prices at the big tech companies continue to rise.

To visualize this, I thought I’d put together a few charts based on real market data. As a proxy for Silicon Valley, I pulled the last 5 years of home prices from Zillow, and monthly stock price data from Yahoo.

Palo Alto Home Prices

Two days ago, the Mercury News reported that a home in Palo Alto sold for $30 million.  A quick check on Zillow seems to confirm this.

I chose Palo Alto as a proxy for Silicon Valley home prices because it is historically “ground zero” for Silicon Valley tech companies, and it has relatively close proximity to all of the massive tech giants (Apple, Google, Facebook).

I picked June 2012 – June 2017, not only because it is roughly five years, but also because it happens to mirror the time that Facebook has spent as a public company. For many in the local real estate market, correctly or incorrectly, the Facebook IPO still looms as a transformational event.

As you can see, in June 2012 the average Palo Alto home cost $1.38 million. Five years later, the estimate for June 2017 is up 84.6% to $2.55 million.

Apple (AAPL)

Apple is the most valuable company in the world, as measured either by market capitalization ($810B as of 6/7/2017) or by profitability ($45.7B in 2016).  Thanks in part to this exceptional financial performance, Apple stock (AAPL) has risen 84.5% in the last five years, from $83.43 per share to $153.93 per share.

84.5%? Where have I heard that number before?

That’s right, the increase in Apple stock over the last five years is almost exactly the same increase as the average home price in Palo Alto over the same time period.

In June 2012, it took 16,555 shares of Apple stock to purchase the average Palo Alto home. In June 2017, it took 16,566 shares. (Of course, with dividends, you’re actually doing a little better if you are a shareholder.)

If you look at the chart, the pink line shows clearly the large rise in price for the average Palo Alto home. The blue line is the number of AAPL shares it would take to buy the average Palo Alto home in that month. As you can see, AAPL stock is volatile, but five years later, that ratio has ended up in almost the exact same place.

Alphabet / Google (GOOG)

Alphabet, the company formerly known as Google, may not be as large as Apple in market capitalization ($686B), but it has seen far more share appreciation in the past five years. Since June 2012, Alphabet has seen its stock price rise 240.4%, from $288.95 in June 2012 to $983.66 per share.

What does this mean? Well, it means that if you have been fortunate enough to hold Google equity, the rise in Palo Alto home prices doesn’t look as ominous. It took 4,780 shares of Google to purchase the average Palo Alto home in June 2012, but it only took 2,592 to purchase the average Palo Alto home in June 2017.

Facebook (FB)

Facebook, the youngest of the massive tech giants, already has one of the largest market capitalizations in the world. As of today, Facebook is valued at $443B. Facebook stock has risen 394% in the past five years, from $31.10 in June 2012 to $153.63 in June 2017.

To state the obvious, it has been a good five years for owners of Facebook stock. Not many assets could make owning Palo Alto real estate look slow, but 394% growth in five years is unbelievable. In June 2012, you would have needed 44,412 shares to buy the average Palo Alto home. In June 2017, that number had dropped significantly to just 16,598 shares.

Bitcoin (BTC)

While I realize that Bitcoin is not a stock, the original idea for this post came from a joke I made on Twitter recently given all of the buzz about Bitcoin, Ethereum and ICOs over the past few weeks.

I couldn’t resist running the numbers.

For the small number of readers of this blog that haven’t been following the price of Bitcoin, the increase in value over the past five years has been unbelievable. The total value of all Bitcoin outstanding is currently about $44.5B. Since June 2012, Bitcoin has risen approximately 42,570%, from $6.70 per Bitcoin to a current value of $2,858.90.

You can see why there has been so much buzz.

In June of 2012, it would have taken 260,149 Bitcoin to buy the average home in Palo Alto. In June of 2017, that number is now down to 892.

Needless to say, anyone who sold Bitcoin to buy a house in 2012 is likely not loving these numbers. But to people who have held Bitcoin for the past five years, Palo Alto is looking cheaper by the day.

Silicon Valley Is Seeing Significant Asset Inflation

To be clear, I’m not attempting to attribute causality to these charts. I believe the real driver of home prices in Silicon Valley is the lack of sufficient building of new supply at pace with the economy, combined with a significant increase in compensation for technology employees and historically low interest rates.

But the fact is, if you are fortunate enough to have equity in one of the tech giants (or in Bitcoin), houses might actually look relatively cheaper now than they did five years ago.

I always find it enlightening to look at real data and compare it to intuition. Hope you find this data and these charts as interesting as I did.

From Technology to Politics: Leadership Lessons from the Code Conference

This past week, I was able to attend the inaugural Code Conference organized by Walt Mossberg & Kara Swisher.  One of the perks of the conference is, within close quarters, the chance to hear the leaders of huge, successful consumer technology companies.

  • Satya Nadella, Microsoft
  • Sergey Brin, Google
  • Brian Krzanich, Intel
  • Brian Roberts, Comcast
  • Reed Hastings, Netflix
  • Travis Kalanick, Uber
  • Drew Houston, Dropbox
  • Eddy Cue, Apple (iTunes / iCloud)

As I think about lessons from the conference, I find myself focused on a particular insight watching these leaders defend their company’s strategy and focus.  (It’s worth noting that anyone being interviewed by Kara does, in fact, have to be ready to play defense.)

David to Goliath

One of the most complex transitions that every consumer technology company has to make is from David to Goliath.  It’s extremely difficult in part because the timing is somewhat unpredictable.  Is Netflix an upstart versus the cable monolith, or a goliath itself as it is responsible for a third of all internet traffic?  When exactly did Google go from cool startup to a giant that even governments potentially fear?  Apple, of course, went from startup to giant to “beleaguered” and all the way to juggernaut.

Make no mistake, however.  The change in public opinion does happen, and when it does, the exact same behaviors and decisions can be read very differently in the court of public opinion.

Technology to Economics to Politics

Most technology companies begin with language that talks about their technical platform and achievements. “Our new product is 10x faster than anything else on the market,” or “Our new platform can handle 10x the data of existing platforms,” etc.  Sometimes, these technical achievements are reframed around end users: “We help connect over 1 billion people every day,” or “we help share over 10 billion photos a week,” etc.

Quickly, however, the best technology companies tend to shift to economics. “Our new product will let you get twice the sales in half the time,” or “our application will save you time and money.”  As they grow, those economic impacts grow as well.  Markets of billions of dollars are commonplace, and opportunities measured in hundreds of billions of dollars.

Unfortunately, as David moves to Goliath, it seems that many technology leaders miss the subtle shift in the expectations from their leadership.   When you wield market power that can be measured on a national (or international) scale, the challenge shifts from economics to politics.  Consumers want to know what leaders they are “electing” with their time and money, and their questions often shift implicitly to values and rights rather than speed or cost.

What Will the World Be Like Under Your Leadership?

As I watched various leaders answer hard questions about their companies, a clear division took place.  Most focused merely on questions of whether they would succeed or fail.  But a few did a great job elevating the discussion to a view of what the world will be like if they are successful.

There is no question that the leaders who elevated the discussion are finding more success in the market.

Satya Nadella gave no real reason why we would like the world better if Microsoft is successful.  Neither did Brian Krzanich of Intel.

Sergey Brin promises that in a world where Google is successful, we’ll have self-driving cars and fast internet for everyone.  Jet packs & flying cars.  It’s an old pitch, but a good one.

Eddy Cue tells us that Apple cares about making sure there is still great music in the world.  And of course, Apple has spent decades convincing us that when they are successful, we get new shiny, well-designed devices every year.

Is it really surprising that Google & Apple have elevated brands with high consumer value?

Technology Leadership

There is no way around the challenges of power.  As any company grows, its power grows, and with that power comes concern and fear around the use of that power.  Google has so much control over information and access to information.  Apple tends to wield tight control over the economics and opportunities within their ecosystem.  However, the leaders at these companies are intelligently making sure that the opportunities they promise the market counter-balance those fears, at least at some level.

Wealthfront, my company, is still small enough that we’re far from being considered anything but a small (but rapidly growing) startup in a space where giants measure their markets in the trillions.  But as I watched these technology leaders at the Code Conference, I realized that someday, if we’re successful, this same challenge will face our company.

If you lead or work for a technology giant, it’s worth asking the question:

Does your message elevate to the point where everyone understands the tangible benefit of living in a world where your company is successful?  If not, I’d argue you’re likely to face increasing headwinds in your efforts to compete in the consumer market going forward.

Make Things As Simple As Possible, But Not Simpler

It can scarcely be denied that the supreme goal of all theory is to make the irreducible basic elements as simple and as few as possible without having to surrender the adequate representation of a single datum of experience.
Albert Einstein

It has become fashionable of late, during the second coming of Apple, for a large number of consultants, executives and professional speakers to frame simplicity as an absolute good.  Simplicity, however, can have a number of negative implications for both design and usability, so I thought it prudent to highlight a few of its limitations as a guiding principle.

Ockham’s Razor vs. Einstein’s Razor

Before jumping to technology, it’s worth noting that this debate has origins in science as well.  Ockham’s Razor famously dictates that, given two hypotheses, the one with the fewest assumptions should be selected.  While not absolute, the principle is important because it shifts the burden of proof to the more complicated explanation.

Einstein (as quoted at the top of this post), pointed out the obvious: simplicity has its limits.  As a result, Einstein’s Razor is commonly stated as:

Make things as simple as possible, but not simpler.

Too many entrepreneurs and executives preaching the simple religion forget this.

Example: iPhone Home Button

When the iPhone launched in 2007, it was an extremely aggressive vision of the future of the smartphone.  Bucking the trend from 12-key numberpads to full QWERTY keypads, the iPhone debuted with just one button.

What could be simpler than one button?

[Image: the original iPhone, with its single home button]

Well, technically zero buttons would have been simpler.

[Image: a mockup of an iPhone with zero buttons]

Why the single button?  Apple decided this was as simple as they could get without hiding a key function they felt people needed to be able to access by touch.  Apple had already decided to remove quite a bit of tactile access from the phone.  Feature phone users lost the ability to know that the “*” key was in the bottom left, or “#” was on the bottom right.  Treo & Blackberry users lost the ability, without looking, to know where keys like space and return were.

The answer? Apple decided that having a tactile method of accessing “home” was more important than enforcing that next level of simplification.  Simple as possible, but not simpler.

Wait? They Added a Switch?

Industrial design aficionados might have already spotted an issue with my previous example.  Apple may have reduced the keypad to a single button, but they actually were applauded at launch for adding a new physical control.

Apple added a hardware switch to mute the phone.

[Image: the original iPhone’s hardware mute switch]

Along with hardware buttons for home, power, and volume up/down, the iPhone added a physical switch for turning mute on or off.

With most other dominant systems at the time (Nokia, Blackberry), turning off your ringer meant navigating from:

Home -> Settings -> Ringer (or Volume) -> Off

Now you could argue that Apple “simplified” the ability to turn off the ringer, but from an interface standpoint they added a control to their highest level of information architecture (the device) for this one function.  This is roughly the equivalent of a website adding this function to its primary header.

In the push to reduce the number of controls, simplicity gave way to an equally important design consideration: minimizing the number of steps to perform a high-value action (with the added benefit of tactile access, crucial for a function you might want to perform sight-unseen, in your pocket).

Simplicity Can Lead to Overloading, Which Is Complex

Anyone who has worked on a design project around information architecture is familiar with the tradeoff.  Reducing the number of controls or number of entry points definitely simplifies the interface.  Fewer choices, less cognitive load on the user.

Unfortunately, if you have five branches at each level of a command structure, you can put 25 commands just two steps away.  If you have three branches at each level, you need three steps to reach that same number of commands.
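
The underlying math is simple exponential fan-out: a menu tree with branching factor b and depth d exposes roughly b^d commands. A quick sketch of the numbers above:

```python
# Commands reachable in a menu tree with a given branching factor and depth.
def reachable_commands(branching: int, depth: int) -> int:
    return branching ** depth

print(reachable_commands(5, 2))  # 25 commands, each just two steps away
print(reachable_commands(3, 3))  # 27 commands, but now three steps deep
```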

No one wants to replicate the Microsoft Office hierarchy of thousands of commands littered across dozens of entry points.  But if your software honestly has four key functions, “simplifying” to one entry point can make the user’s job harder, not easier.

Wealthfront: Building Trust with Transparency

At Wealthfront, one of our top priorities is building trust with guest visitors to our site.  Interestingly, we’ve discovered that over-simplification has another negative attribute: when people don’t readily see the answer to a key question, there is potential for them to assume you’re hiding that information.

As a result, our new user experience carefully balances simplicity with providing crucial information to our visitors, even at the risk of some complexity.

We show our clients up front our investment choices, down to quick answers for why we’ve chosen each particular ETF.  We provide examples of both taxable and tax-deferred account allocations up front, even before the visitor has signed up for the service.

[Screenshot: the Wealthfront new user experience]

To be sure, like all software interfaces, there are significant improvements that we can make to our new user experience.  But it’s worth sharing that our experience has been that blind adherence to simplicity can actually hurt the level of confidence and trust people have in your service.  This interface has carried the company to record growth in 2013, up over 250% for the year (as of September).

More broadly, it’s worth considering that when you bury functions and features, you may trigger emotions in your user that aren’t positive:

  • Frustration. They don’t know where to look for something they want.
  • Anxiety. They worry that the thing they need is no longer supported.
  • Distrust. They assume that you are hiding something for a reason.

So remember, when someone preaches the religion of simplicity, think carefully about Einstein’s Razor.

Make it as simple as possible, but not simpler.

Home Storage & Network Topology (2013)

In 2011, I wrote a fairly popular blog post outlining my home solution for storage & backup.

Since it has been almost two years, I thought I’d update the information with some improvements.

Updated Network Topology

In 2012, I had a chance to update our network infrastructure, and as a result we have a slightly different home network topology than the one I diagrammed in 2011.  The following image shows the current, high-level structure (note: I haven’t documented all devices or switches on the network).

[Diagram: 2013 Home Network Topology]

Enhancement: Comcast 105Mbps Service

In March 2013, Comcast announced doubling its internet connectivity speeds in the San Francisco Bay Area for no additional cost.  This proved to be enough of an improvement to get me to face the reality that AT&T Uverse was never, ever going to get any faster than 24Mbps.

My order is in to convert to Comcast.  I’ll post here if the experience is anything but what’s expected – a massive increase in download speeds.  With multiple people in our household now streaming Netflix, up to four at once, I think the upgrade is perfectly timed.

Enhancement: WD 6TB Thunderbolt Duo for iTunes

Last month, tragedy struck.  The 4TB USB 3.0 hard drive I had been using for the main iTunes library crashed.  Fortunately, thanks to the backup solution in place, all files were recovered.

The only problem was recovery time.  It was slow.  It turns out, restoring about 3.5 TB from the Synology box to a USB hard drive took over 38 hours.  Now, granted, Time Machine isn’t the fastest recovery software, but it’s what I’ve been using reliably.

At 3.5TB, I realized I was going to max out the Seagate 4TB drives soon anyway.  After some research, I decided to get the 6TB Western Digital Thunderbolt Duo.  With two 3TB drives striped with RAID 0, combined with the 10Gbps Thunderbolt bus, I was hoping for significant speed improvements.

Restoring 3.5TB via Time Machine from my Synology box to the Thunderbolt Duo took less than 16 hours, a huge improvement over the previous experience with the Seagate USB drive.  Most of this benefit is likely due to the Thunderbolt bus (I gave the drive a dedicated port on the iMac.)  Regardless, I’m thrilled to have a solution that will continue to scale through the year until larger single disk drives are available. (As a caveat, I’m now at double the risk of failure on the main iTunes drive, since if either drive fails, the whole RAID 0 volume fails.)
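
Translated into effective throughput, a rough back-of-the-envelope calculation from the restore times above looks like this:

```python
# Effective restore throughput for the ~3.5 TB Time Machine restores above.
def throughput_mb_per_s(terabytes: float, hours: float) -> float:
    return terabytes * 1_000_000 / (hours * 3600)  # decimal TB to MB

print(f"{throughput_mb_per_s(3.5, 38):.0f} MB/s")  # Seagate USB drive: ~26 MB/s
print(f"{throughput_mb_per_s(3.5, 16):.0f} MB/s")  # Thunderbolt Duo:   ~61 MB/s
# Both are far below the 10Gbps Thunderbolt bus limit, so Time Machine and
# the disks themselves are likely the real bottleneck.
```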

Last Note: Stagnation in Hard Drives

It’s worth noting that it has been over 18 months since we’ve seen a larger single 3.5″ hard drive size.  We’ve been promised 6TB drives later this year, with headroom to 60TB for a 3.5″ drive on the upcoming technology, but it’s clear that single disk storage isn’t really keeping up with the increasingly large file sizes of HD video storage.  Imagine the strain when files go to 3D and Ultra HD formats.

For those of you who are interested in these types of technical details, I hope you find the above useful.

Home Media / AV Configuration (2013)

From time to time, friends and family will ask me how I configure the devices in my house for media.  Since I just got this question again last week, I thought I’d take a moment to document it here.  In the past, I’ve documented my storage & backup solution, my time machine setup, as well the configuration of my old wireless network.

Basic Assumptions

Since there are an incredible number of technology and service choices that can affect a home media solution, it’s best I lay out some of the basic decisions my household has currently made around media technology:

  • Comcast HD is our HD television service

  • iTunes HD is our standard movie purchase format
  • Netflix is used for movie rental
  • Tivo is our DVR of choice

Of all of these choices, the ones that are most material are the choice of Comcast HD / Tivo, as Comcast is the best HD service for modern Tivo DVRs, and the standardization on iTunes HD, not Blu-Ray, for HD movie purchases.

Office Configuration

Our home media solution is grounded in the home office, but really has become fairly distributed between the cloud and local devices. In fact, at this point, the home office solution is really used more for backup and legacy purposes.

[Diagram: home office media configuration]

The key elements of the configuration are as follows:

  • The iMac is really the “source of truth” for the media library in the house
  • The media library is large (each HD movie is about 4GB), so it sits on its own 4TB USB HD
  • The iMac backs up to the Synology box via Time Machine
  • Wireless devices (laptops, iPads, iPhones) connect via 802.11N
  • The Gigabit Ethernet switch is connected to the central home network

Living Room Configuration

The consumption solution in any room with a television is largely the same.  Here is a diagram of its fundamental components:

[Diagram: living room media configuration]

The key elements of the configuration are as follows:

  • The Gigabit Ethernet switch connects all the devices to the central home network
  • The AppleTV is used to watch purchased HD movies from iTunes, Netflix for streaming, and access the home media library on the iMac
  • The Tivo is used to watch live / recorded television (from Comcast)
  • The Blu-Ray player is there in theory, in case we ever want to watch a Blu-Ray disc, which almost never happens

A Few Caveats

This solution currently has the notable sub-optimal elements:

  • I didn’t include an A/V receiver or surround sound solution in the above description, because that actually varies room to room.  In some rooms we have an AV receiver, in others we utilize a surround sound bar or just use TV audio.
  • Input switching.  We almost never use the Blu-Ray, but this solution does require switching inputs between AppleTV & Tivo, which is a bit annoying since the Tivo remote can’t control the AppleTV and vice-versa.

While I’m sure this solution will not impress any cinephile out there, hopefully it will be useful to a few of you thinking through how to setup or reconfigure your home media solution.

I’ll try to do a follow up post with what I’m hoping to see in 2013 to make this even better.

Blackberry’s Impossible Mission

Today, Research in Motion (newly renamed BlackBerry) announced with great fanfare their new Blackberry 10 operating system and devices.  Unfortunately, the market has shifted so radically in the past few years, it’s not clear to me what path exists for any meaningful success for Blackberry.

Blackberry is on an impossible mission.

Why Blackberry?

I used a Blackberry for over seven years.  In fact, I didn’t move to the iPhone until the 3G came out with the native application platform.  Like many, I was addicted to the perceived and actual productivity of messaging on the Blackberry and the physical keyboard.

Like most people who make the switch, it took me a few weeks to get to be “good enough” to type and message effectively on the iPhone.  The millions who are still on the Blackberry tend to focus on exactly one issue: the Blackberry is an amazing messaging device, thanks to the keyboard & software optimization.

The Victory of the Touch Screen

I remember, in 2009, making a Blackberry my temporary “full time” mobile device for a few days.  It was amazing – in just a year, I had completely lost all the muscle memory that made me so productive on the Blackberry.  The iPhone had won.

The reason is simple: a fast, modern device that offers the full richness of the modern web, combined with a vibrant and high quality native application market dominates the marginal efficiency in messaging.  Whether you use iOS or Android, minor productivity improvements in SMS & Email are swamped by access to applications, games, web services, cloud platforms and a myriad of other capabilities.  The smartphone itself has now evolved into a variety of form factors and niches, with phablets and tablets eating an increasing share of our attention and computing.

Blackberry’s Impossible Mission

Right now, it seems like Blackberry has no viable path as a third platform.

Yes, the ardent users of the platform can buy the new devices for their hardware keyboards.  But there aren’t enough of them (h/t to Daring Fireball), and it’s hard to imagine that this market won’t get eaten by the flexibility provided by the Android platform in time.

Yes, there are IT departments that continue to have their companies locked down on the Blackberry, but the new operating system will likely create enough migration issues that they will end up moving to iOS, Android, or both as supported platforms.

The real problem is that their touchscreen product cannot possibly provide enough unique functionality to justify the choice over the iPhone or Android at the medium to high end.  At the low end, they cannot possibly underprice the Android ecosystem.

Damned if they do, Damned if they don’t

In other words, if they abandon their customer-defined differentiator (keyboard), they’ll lose all differentiation in the market.  If they don’t, they are left with an eroding, minority share of a market that is likely insufficient in size and economics to fund their continued development and support of a competitive mobile ecosystem.  As a developer, spending precious resources on this, at best, stagnant minority pool of potential users is tough to justify.

Microsoft can play this game, for a while, because they (still) have relatively unlimited free cash flow and a desktop platform that still boasts hundreds of millions of users.  Blackberry doesn’t.

How to Recover the Left Side Navigation in iTunes 11

I can’t believe I’m writing this blog post, but I am.

Last night, I tweeted out my joy at finding out that Apple did, in fact, provide a menu item to re-enable the side navigation in iTunes 11.  Now, while I’m not a huge fan of the complexity and modality of the older iTunes interface, there is no doubt that after using iTunes 11 for a week, you wish for the halcyon days of the left navigation bar.

Surprisingly, enough people tweeted and commented in gratitude that I realized I should probably summarize in a blog post.

iTunes 11 – Default

This is the iTunes 11 default interface. (Try to ignore my taste in movies for a second)

[Screenshot: the iTunes 11 default interface]

iTunes 11 – Sidebar

This is iTunes 11 with the sidebar enabled.

[Screenshot: iTunes 11 with the sidebar enabled]

All of a sudden, the shockingly horrid modality of the iTunes 11 default interface is resolved.  You can easily select which sub-category of content in your iTunes library you want to browse, and viewing connected devices and playlists has once again become trivial.  It turns out, you still end up with the horrid choices for navigation views within a “domain”, but at least we’re 80% of the way back to the (limited) usability of the previous iTunes interface.

Wait, How Did You Do It?

It’s hidden under the View menu: “Show Sidebar”.

[Screenshot: the View menu, with “Show Sidebar” selected]

Simple does not mean Easy to Use

Just as cuffs, collars and neckties are subject to the whims of fashion, so also do memes in design tend to come and go in software.  I think iTunes 11 represents a bit of a teachable moment on a couple concepts that have been overplayed recently, and what happens when you take them too far.

  1. Consistency does not always lead to ease of use.  Having a more consistent interface between the iPhone, iPad, AppleTV and Mac OS renditions of iTunes may seem like an “obvious” goal, but the fact is all of these devices vary in terms of input mechanisms and use cases.  The truth is, many users sit down at a desktop for different tasks than they sit down at a TV for, and the interface of the desktop is optimized for those tasks with large, high resolution screens and a keyboard.  My best guess here is that Apple optimized the interface for laptops, not desktops, and for consumption, not curation.  However, Apple would have been well served to provide a “first launch” experience with packaged pre-sets of these minor configurable options, to let users who are upgrading easily identify their primary mode of operation.  I would love Apple to take a more proactive stance on how to build applications and services that provide elements of commonality across the multitude of devices that users increasingly use to author, curate and consume content, without blind adherence to making everything look & behave “the same”.
  2. Simple does not mean easy to use.   On the heels of Steve Jobs mania, it has become ultra-fashionable to talk about simplicity as the end-all, be-all of product design.  The fact is, there is often a trade-off between reducing the number of controls that an application (or device) has, and introducing increased modality for commonly used functions.  The one button mouse was, in fact, simpler than the two button mouse.  However, it came at the expense of pushing a significant amount of functionality into a combination of selection and menu modality.  Look at the poor “single button” on the iPhone.  Simple, but now stacked with modality based on the number and timing of presses.  Designers would do well to consider the balance of simplicity, accessibility and the difficult decision of which functions are so key to an application that they require “zero click” comprehension of availability.  For iTunes 11, the hidden modality of managing the devices synced to your iTunes library is unforgivable. (The likely sin here is being too forward looking. As we move to iCloud for everything, the need for devices to be tethered to iTunes goes away.  But we’re not there yet with video.)

I hope this helps at least one person out there have a better experience with iTunes 11.

Apple & Dow 15000: Update

In February 2012, I wrote a blog post that indicted the Dow Jones Industrial Average for including Cisco in 2009 instead of Apple.  At the time, Apple had just crossed $500 per share, and that simple decision had cost the US the psychology of an index hitting new highs.

I was driving home on Sunday, listening to the radio, and it occurred to me how different the financial news would be if Apple ($AAPL) was in the Dow Jones Industrial Average (^DJI).

Of course, being who I am, I went home and built a spreadsheet to recalculate what would have happened if Dow Jones had decided to add Apple to the index instead of Cisco back in 2009.  Imagine my surprise to see that the Dow would be over 2,000 points higher.

Update: AAPL at $700

With the launch of the iPhone 5, we find ourselves roughly 7 months later.  For fun, I re-ran the spreadsheet that calculated what the DJIA would be at if they had added AAPL to the index in 2009 instead of CSCO. (To date, I’ve never seen an explanation on why Cisco was selected to represent computer hardware instead of Apple.)

Result: Dow 16,600

As of September 17, 2012, AAPL closed at $699.78 per share.  As it turns out, if Dow Jones had added Apple instead of Cisco in 2009, the index would now be at 16,617.82.  Hard to think that hitting all new highs wouldn’t be material for market psychology and the election.

Anyone up for Dow 20,000?

The Game Has Changed. Design for Passion.

One of the most exciting developments in software has been a resurgence in the focus and priority on design.  With the growing dominance of social platforms and mobile applications, more and more people are growing comfortable productively discussing and utilizing insights about human emotion in their work.

Google: The Era of Utility

The progress of the last five to seven years is really a significant breakout from the previous generations of software design.

For decades, software engineers and designers focused on utility:  value, productivity, speed, features or cost.

If it could be quantified, we optimized it.  But at a higher level, with few exceptions, we framed every problem around utility.  Even the field of human-computer interaction was obsessed with “ease of use.”  Very linear, with clear ranking.  How many clicks? How long does a task take?  What is the error rate?

In some ways, Google (circa 2005) represented the peak of this definition of progress.  Massive data.  Massive scalability.  Incredible utility.  Every decision defined by quantifying and maximizing utility by various names.

But let’s face it, only computer scientists can really get passionate about the world’s biggest database.

Social: The Era of Emotion

Like any ecosystem, consumer technology is massively competitive.  Can you be faster, cheaper, bigger or more useful than Google?  It turns out, there is a more interesting question.

Social networks helped bring the language of emotion into software.  A focus on people starts with highly quantifiable attributes, but moves quickly into action and engagement.

What do people like? What do they hate? What do they love? What do they want?

In parallel, there have been several developments that reflect similar insights on the web, in behavioral finance, and the explosion in interest in game mechanics.

Human beings are not rational, but (to borrow from Dan Ariely) they are predictably irrational.  And now, thanks to scaling social platforms to over a billion people, we have literally petabytes of data to help us understand their behavior.

Passion Matters

Once you accept that you are designing and selling a product for humans, it seems obvious that passion matters.

We don’t evaluate the food we eat based on metrics (although we’d likely be healthier if we did).  Do I want it? Do I love it? How does it make me feel?

The PayPal mafia often joke that great social software triggers at least one of the seven deadly sins. (For the record, LinkedIn has two: vanity & greed).  Human beings haven’t changed that much in the past few thousand years, and the truth is the seven deadly sins are just a proxy for a deeper insight.  We are still driven by strong emotions & desires.

In my reflection on Steve Jobs, he talks about Apple making products that people “lust” for.  Not the “the best products”, “the cheapest products”, “the most useful products” or “the easiest to use products.”

Metrics oriented product managers, engineers & designers quickly discover that designs that trigger passion outperform those based on utility by wide margins.

The Game Has Changed

One of the reasons a number of earlier web giants are struggling to compete now is that the game has changed.  Utility, as measured by functionality, time spent, ease-of-use are important, but they are no longer sufficient to be competitive. Today, you also have to build products that trigger real emotion.  Products that people will like, will want, will love.

Mobile has greatly accelerated this change.  Smartphones are personal devices.  We touch them, they buzz for us. We keep them within three feet of us at all times.

Too often in product & design we focus on utility instead of passion.  To break out today, you need to move your efforts to the next level.  The questions you need to ask yourself are softer:

  • How do I feel when I use this?
  • Do I want that feeling again?
  • What powerful emotions surround this product?

Go beyond utility.  Design for passion.

User Acquisition: Mobile Applications and the Mobile Web

This is the third post in a three post series on user acquisition.

In the first two posts in this series, we covered the basics of the five sources of traffic to a web-based product and the fundamentals of viral factors.  This final post covers applying these insights to the current edge of product innovation: mobile applications and the mobile web.

Bar Fight: Native Apps vs. Mobile Web

For the last few years, the debate between building native applications vs. mobile web sites has raged.  (In Silicon Valley, bar fights break out over things like this.) Developers love the web as a platform.  As a community, we have spent the last fifteen years on standards, technologies, environments and processes to produce great web-based software.  A vast majority of developers don’t want to go back to the days of desktop application development.

Makes you wonder why we have more than a million native applications out there across platforms.

Native Apps Work

If you are religious about the web as a platform, the most upsetting thing about native applications is that they work.  The fact is, in almost every case, the product manager who pushes to launch a native application is rewarded with metrics that go up and to the right.  As long as that fact is true, we’re going to continue to see a growing number of native applications.

But why do they work?

There are actually quite a few aspects of the native application ecosystem that make it explosively more effective than the desktop application ecosystem of the 1990s.  Covering them all would be a blog post in itself.  But in the context of user acquisition, I’ll posit a dominant, simple insight:

Native applications generate organic traffic, at scale.

Yes, I know this sounds like a contradiction.  In my first blog post on the five sources of traffic, I wrote:

The problem with organic traffic is that no one really knows how to generate more of it.  Put a product manager in charge of “moving organic traffic up” and you’ll see the fear in their eyes.

That was true… until recently.  On the web, no one knows how to grow organic traffic in an effective, measurable way.  However, launch a native application, and suddenly you start seeing a large number of organic visits.  Organic traffic is often the most engaged traffic.  Organic traffic has strong intent.  On the web, they typed in your domain for a reason.  They want you to give them something to do.  They are open to suggestions.  They care about your service enough to engage voluntarily.  It’s not completely apples-to-apples, but from a metrics standpoint, the usage you get when someone taps your application icon behaves like organic traffic.

Giving a great product designer organic traffic on tap is like giving a hamster a little pedal that delivers pure bliss.  And the metrics don’t lie.

Revenge of the Web: Viral Distribution

OK. So despite fifteen years of innovation, we as a greater web community failed to deliver a mechanism that reliably generates the most engaged and valuable source of traffic to an application.  No need to despair and pack up quite yet, because the web community has delivered on something equally (if not more) valuable.

Viral distribution favors the web.

Web pages can be optimized across all screens – desktop, tablet, phone.  When there are viral loops that include the television, you can bet the web will work there too.

We describe content using URLs, and universally, when you open a URL it goes to the web.  We know how to carry metadata in links, allowing experiences to be optimized based on the content, the mechanism through which it was shared, who shared it, and who received it.  We can multivariate test it in ways that border on the supernatural.
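
As a concrete illustration of carrying metadata in links, a minimal sketch follows; the parameter names are hypothetical, not any particular platform’s schema:

```python
from urllib.parse import parse_qs, urlencode, urlparse

# Build a share URL that carries metadata about the content, the sharing
# channel, and the sender. Parameter names are illustrative only.
def share_url(base: str, content_id: str, channel: str, sender_id: str) -> str:
    return f"{base}?{urlencode({'c': content_id, 'ch': channel, 's': sender_id})}"

url = share_url("https://example.com/item", "12345", "email", "u987")
print(url)  # https://example.com/item?c=12345&ch=email&s=u987

# The landing page can parse the same metadata to optimize the experience
# and attribute the visit in an A/B test.
meta = parse_qs(urlparse(url).query)
print(meta["ch"][0])  # 'email'
```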

To be honest, after years of conversations with different mobile platform providers, I’m still somewhat shocked that in 2012 the user experience for making URLs resolve seamlessly to either the web or a native application is as poor as it is.  (Ironically, Apple solved this issue in 2007 for Youtube and Google Maps, and yet for some reason has failed to open up that registry of domains to the developer community.)  Facebook is taking the best crack at solving this problem today, but it’s limited to their channel.

The simple truth is that the people out there that you need to grow do not have your application.  They have the web.  That’s how you’re going to reach them at scale.

Focus on Experience, Not Technology

In the last blog post on viral factors, I pointed out that growth is based on features that let a user of your product reach out and connect with a non-user.

In the mobile world of 2012, that may largely look like highly engaged organic users (app) pushing content out that leads to a mobile web experience (links).
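
To recap the viral-factor arithmetic from the previous post in this series, here is a minimal sketch of the standard model (the seed size and viral factors are made-up numbers):

```python
# Minimal viral-loop model: each cohort of new users converts k times its
# own size in non-users during the next period. Inputs are illustrative.
def total_users(seed: int, k: float, periods: int) -> int:
    total = cohort = float(seed)
    for _ in range(periods):
        cohort *= k      # invites sent x conversion rate = next cohort
        total += cohort
    return round(total)

print(total_users(1000, 0.7, 12))  # k < 1: the seed amplifies ~3.3x, then stalls
print(total_users(1000, 1.1, 12))  # k > 1: compounding, self-sustaining growth
```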

As a product designer, you need to think carefully about the end-to-end experience across your native application and the mobile web.  Most likely, a potential user’s first experience with your product or service will be a transactional web page, delivered through a viral channel.  They may open that URL on a desktop computer, a tablet, or a phone.  That will be your opportunity to convert them into an engaged user, in many cases by encouraging them to download your native application.

You need to design a delightful and optimized experience across that entire flow if you want to see maximized self-distribution of your product and service.

Think carefully about how Instagram exploded in such a short time period, and you can see the power of even just one optimized experience that cuts across a native application and a web-based vector.

Now go build a billion dollar company.

Review: Quicken 2007 for Mac OS X Lion

This is going to be a short post, but given the attention and page views that my posts on Quicken 2007 received, I thought this update worthwhile.

Previous Posts

Quicken 2007 for Mac OS X Lion Arrives

Last week, Intuit announced the availability of an anachronism: Quicken 2007 for Mac OS X Lion.  It sounds odd at first, given that we should really be talking about Quicken 2013 right about now, but it’s not a misprint.  This is Quicken 2007, magically enabled to actually load and run on Mac OS X Lion.  It’s like Intuit cloned a Woolly Mammoth and put it in the New York Zoo.

The good news is that the software works as advertised.  I have a huge file, with data going back to 1994.  However, not only did it operate on the file seamlessly, the speed improvement over running it on a Mac Mini running Mac OS X Snow Leopard is significant.  Granted, my 8-core iMac likely explains that difference (and more), but the end result is the same.  Quicken.  Fast.  Functional.  Finally.

There are small bugs.  For example, some dialogs seem to have lost the ability to resize, or columns cannot be modified.  But these are very small issues.

Where is it, anyway?

If you go to the Intuit website, you’ll have a very hard time finding this product:

  • It’s not listed on the homepage
  • It’s not listed on the products page
  • It’s not listed on the page for Quicken for Mac
  • It’s not listed in the customer support documents (to my knowledge)
  • It doesn’t come up in site search

However, if you want to pay $14.95 for this little piece of magic (and given the comments on my previous posts, quite a few people will), then you can find it here:

Goodbye, Mac Mini

I have it on good authority that Intuit is working on adding the relevant & required investment functionality to Quicken Essentials for Mac to make it a true personal finance solution.  There is a lot of energy on the Intuit consumer team these days thanks to the infusion of the Mint.com team, and I’m optimistic that we’ll see a true, fully featured personal finance client based on the Cocoa-native Quicken Essentials eventually.

How to Fix the Apple TV 2 “Blinking White Light of Death”

This is one of those public service announcement blog posts that I write whenever I run into a non-trivial technical problem.  My hope is always that the time I take to write this up will save someone time & money in the future.

The AppleTV 2 Blinking White Light of Death

Problem is simple: Your AppleTV 2 has a blinking white LED that never stops, and all it displays on the TV is an image instructing you to connect the device to iTunes.

Cause: Most likely, you interfered with a firmware update. In my case,  I had selected an option on my AppleTV 2 to update its firmware.  However, before it was complete, the power to the device was cut.

Mission: Find a Micro USB Cable

I didn’t realize it was possible to physically connect your AppleTV 2 to your computer.  This blog post was my first clue on what had caused my issue, and how to solve it.  Unfortunately, it sounded like he never was able to solve the problem directly.

It’s a bit strange that Apple decided to put a Micro USB port on the AppleTV 2.  However, after reading this support article on the Apple website, I was determined to try to fix it myself.

Finding a Micro USB cable turned out to be non-trivial.  To the casual observer, the Micro USB and the Mini USB look very similar.  The Mini USB is used by Blackberries, hard drives, and countless devices.  The Micro USB port is a bit smaller, flatter, and more oval.

Apple actually does not carry the cable in store, although you can get one online.  The trick was finding a device that uses the Micro USB.  In my case, I found them stocked next to the Sony eReader.

iTunes Saves the Day

I plugged the new Micro USB cable into a powered USB 2.0 hub.  Given some of the issues reported by others, I suspect that it’s possible that the power draw of the AppleTV might be a bit more than typical USB ports can handle.  In any case, the Apple TV showed up in iTunes 10.5.x.  I clicked the “Restore” button, and a couple of minutes later it was done.

No issues at all with the device – it was literally reset to a factory clean state.

Since an overwhelming number of support articles and comments I found online suggested that this didn’t or wouldn’t work, I thought I’d put this blog post out there. Hopefully it will help someone in their hour of need.

Apple, Cisco, and Dow 15000

I was driving home on Sunday, listening to the radio, and it occurred to me how different the financial news would be if Apple ($AAPL) was in the Dow Jones Industrial Average (^DJI).

Of course, being who I am, I went home and built a spreadsheet to recalculate what would have happened if Dow Jones had decided to add Apple to the index instead of Cisco back in 2009. Imagine my surprise to see that the Dow would be over 2,000 points higher.

In real life, the Dow closed at 12,874.04 on Feb 13, 2012. However, if they had added Apple instead of Cisco, the Dow Jones would be at 14,926.95. That’s over 750 points higher than the all-time closing high of 14,164.53, set on October 9, 2007.

Can you imagine what the daily financial news of this country would be if every day the Dow Jones was hitting an all-time high?  How would it change the tone of our politics? Would we all be counting the moments to Dow 15,000?

Why Cisco vs. Apple?

This isn’t a foolhardy exercise.  The Dow Jones Industrial Average is changed very rarely, in order to promote stability and comparability in the index.  However, on June 8, 2009, they made two changes to the index:

  • They replaced Citigroup with Travelers
  • They replaced General Motors with Cisco

The question I explored was simple – what would have happened if they had replaced General Motors with Apple on June 8, 2009? After all, Apple was up over 80% off its post-crash lows. The company had a large, but not overwhelming, market capitalization. The index was already filled with “big iron” tech stocks like Intel, HP & IBM. Why add Cisco? Why not add a consumer tech name instead?

In fact, there is no readily obvious justification for adding Cisco to the index in 2009 instead of Apple.

The Basics of the Dow Jones Industrial Average

Look, I’m just going to say it. The Dow Jones Industrial Average is ridiculous.

You may not realize this, but the Dow Jones Industrial Average, the “Dow” that everyone quotes as representative of the US stock market, and sometimes even a barometer of the US economy, is a mathematical farce.

Just thirty stocks, hand-picked by a committee at Dow Jones, with no rigorous requirements. Worse, it’s a “price-weighted” index, which is mathematically nonsensical. When calculating the Dow Jones Industrial Average, they take the actual stock price of each component, add them together, and divide by a “Dow Divisor”. They don’t take into account how many shares are outstanding; they don’t assess the market capitalization of each company. When a stock splits, they actually change the divisor for the whole index. It’s completely unclear what this index is designed to measure, other than financial illiteracy.

In fact, there is only one justification for the Dow Jones Industrial Average being calculated this way.  Dow Jones explains it in this post on why Apple & Google are not included in the index.  To save you some time, I’ll summarize: they have always done it this way, and if they change it, then they won’t be able to compare today’s nonsensical index to the nonsensical index from the last 100+ years.
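
To make the mechanics concrete, here is a minimal sketch of the arithmetic in Python. The numbers are illustrative placeholders, not real quotes; only the sum-and-divide mechanism and the divisor adjustment come from the description above.

```python
# A price-weighted index: add up the component prices, divide by the divisor.
def price_weighted_index(prices, divisor):
    return sum(prices) / divisor

# When a component changes (a split, or a substitution like CSCO -> AAPL),
# the divisor is adjusted so the index value is unchanged at that instant.
def adjusted_divisor(prices_before, prices_after, old_divisor):
    index_at_change = sum(prices_before) / old_divisor
    return sum(prices_after) / index_at_change

# Illustrative only: a 2-for-1 split of a $100 stock in a toy 3-stock index.
before, after = [100.0, 50.0, 30.0], [50.0, 50.0, 30.0]
new_div = adjusted_divisor(before, after, old_divisor=1.0)
assert abs(price_weighted_index(after, new_div)
           - price_weighted_index(before, 1.0)) < 1e-9
```

Note what this implies: the dollar price of a share, not the size of the company, determines its weight. A 1% move in a $500 stock like AAPL moves the price sum about 25 times as much as a 1% move in a $20 stock like CSCO.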

So what? Does it really matter?

It’s a fair critique. Look, with 20/20 hindsight, there is a limitless number of changes we could make to the index to change its value. Imagine adding Microsoft and Intel to the index in 1991 instead of 1999.

I don’t think this exercise is that trivial in this case, though. The Dow committee already decided to make a change in 2009. They decided to replace a manufacturing company (GM) with a large hardware technology company (CSCO). They could just as easily have picked Apple instead.

The end result? People still talk about the stock market being “significantly off its highs” of 2008. In truth, no one should be reporting the value of the Dow Jones Industrial Average at all. But they do, and therefore it matters. As a result, the choices of the Dow Jones committee matter, and unfortunately, there seems to be no accountability for those choices.

Appendix: The Numbers

I’ve provided below the actual tables used for my calculations. Please note that all security prices are as of market close on Monday, Feb 13, 2012. The Dow Divisor for the alternate reality with AAPL in the index was derived by recalculating the divisor adjustment for the hypothetical 6/8/2009 substitution of AAPL for CSCO, plus a recalculated adjustment for the VZ spinoff on 7/2/2010.

Real DJIA                        DJIA w/ AAPL on 6/8/09
Company   2/13/2012              Company   2/13/2012
MMM       88.03                  MMM       88.03
AA        10.33                  AA        10.33
AXP       52.07                  AXP       52.07
T         30.04                  T         30.04
BAC       8.25                   BAC       8.25
BA        74.85                  BA        74.85
CAT       113.70                 CAT       113.70
CVX       106.38                 CVX       106.38
CSCO      20.03                  AAPL      502.60
KO        68.44                  KO        68.44
DD        50.60                  DD        50.60
XOM       84.42                  XOM       84.42
GE        19.07                  GE        19.07
HPQ       28.75                  HPQ       28.75
HD        45.93                  HD        45.93
INTC      26.70                  INTC      26.70
IBM       192.62                 IBM       192.62
JNJ       64.68                  JNJ       64.68
JPM       38.30                  JPM       38.30
KFT       38.40                  KFT       38.40
MCD       99.65                  MCD       99.65
MRK       38.11                  MRK       38.11
MSFT      30.58                  MSFT      30.58
PFE       21.30                  PFE       21.30
PG        64.23                  PG        64.23
TRV       58.99                  TRV       58.99
UTX       84.88                  UTX       84.88
VZ        38.13                  VZ        38.13
WMT       61.79                  WMT       61.79
DIS       41.79                  DIS       41.79
Total     1701.04                Total     2183.61
Divisor   0.13212949             Divisor   0.146286415
Index     12874.04               Index     14926.95

Calculating the “alternate divisor” requires getting the daily stock quotes for the days when the index changed, and recalculating to make sure that the new divisor with the new components gives the same index value for that day. It’s a bit messy, and depends on public quote data, so please feel free to check my math if I made a mistake.
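
For anyone who wants to check the math, here is a minimal sketch of that recalculation in Python, using the totals and divisors from the table above. The chaining approach is as described; the function name is mine.

```python
# On a switch date, the index must be identical before and after the change:
#   new_divisor = old_divisor * sum(new_prices) / sum(old_prices)
# evaluated with that day's closing quotes. Chain this through each index
# change (the hypothetical 6/8/2009 swap, then the 7/2/2010 VZ adjustment).
def rebalanced_divisor(old_divisor, old_price_sum, new_price_sum):
    return old_divisor * new_price_sum / old_price_sum

# Sanity checks against the 2/13/2012 totals in the table above:
assert round(1701.04 / 0.13212949, 2) == 12874.04    # real DJIA
assert round(2183.61 / 0.146286415, 2) == 14926.95   # DJIA w/ AAPL
```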

The Synology DS1511+ RAID NAS & Time Machine on Mac OS X Lion

I recently suffered one of those network storage failures that you have nightmares about. After spending more than $1000 on a NetGear ReadyNAS NV+, I had a catastrophic failure that cost me all of the data on the system. Believe it or not, it was a single drive failure – exactly the type of problem you spend money on a RAID system to survive. Unfortunately, in my case, it didn’t.

On the bright side, I had the opportunity to rethink and rebuild my storage and backup solutions from scratch. In a recent blog post, I described my new network and storage topology.

Synology DS1511+ to the Rescue

The Synology DS1511+ is a great device. It sits on your Gigabit network, handles up to five SATA hard drives, and can act as a wide variety of servers for your network. I configured mine with five 3TB Western Digital Caviar Green drives, for 15TB of notional storage and 8.3TB of usable storage.

The Synology supports “dual drive redundancy”, so for the price of 2 drives worth of storage, you end up with protection for your data even if two drives fail simultaneously.  Needless to say, I went for that option.
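
As a back-of-the-envelope check on those numbers, here is my own sketch, assuming dual drive redundancy consumes two drives’ worth of capacity, as in RAID-6:

```python
# Capacity math for five 3TB drives with dual drive redundancy.
drives, drive_tb = 5, 3.0
notional = drives * drive_tb             # 15 TB of raw, notional storage
protected = (drives - 2) * drive_tb      # 9 TB after two drives go to parity
# Drive makers count decimal terabytes, but operating systems report
# binary units, so the figure you actually see shrinks a bit further:
visible = protected * 1e12 / 2**40       # ~8.2, close to the 8.3TB usable
print(notional, protected, round(visible, 1))
```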

The industrial design of the box is well done. You do have to break out the screwdriver to install the drives into trays (not quite as nice as the plug-and-play SATA bays on the Drobo FS), but the case itself is small, quiet and black. It also has nice locks on each drive bay, which has made it “child proof” for my 2-year-old, who is unfortunately fascinated with the blinking lights.

The Synology box is incredibly fast. First, it supports two Gigabit Ethernet ports, so it can serve connections from multiple clients independently. But even from one machine, it’s wicked fast. A simple Finder copy of a 500MB file to the drive takes under 6 seconds. I was able to back up 2.7M files totaling 4.05TB using Time Machine (usually dog slow) in about 26 hours.

The Synology management software is Windows 2000-like in both its user interface and its incredible breadth of options. Needless to say, I only use about 1% of them. I did run into one issue, however, and hence the title of this blog post: configuring the box for Time Machine on Mac OS X 10.7 Lion was non-trivial.

Time Machine on Mac OS X 10.7 Lion & Synology DSM 3.2

Time Machine is the most consumer-friendly solution for incremental backup on the Mac. Unfortunately, if you have multiple machines, you run into a small issue: Apple designed the software as if it “owns” the entire drive you point it at. As a result, you can’t just point all your machines at a single network drive without a number of bad things happening.

Instead, you have to somehow convince Time Machine to only use part of the drive. This turned out to be quite an issue for me, since I wanted to be able to back up my machine (~4TB) as well as my wife’s MacBook Pro (~500GB).

Synology has published documents on how to configure the box for Time Machine, and has designed its software around a very clever option. The basic idea is that you create a separate “user” for each machine you want to back up with Time Machine. For each user, you assign a limited quota, and then you tell Time Machine to use that user when connecting to the Synology volume. It actually works quite well, although it feels a little strange to create separate user accounts for each machine, on top of accounts for each person.

The Undocumented 4TB Limit

Unfortunately, I ran into an undocumented issue. When I tried to set the quota for my machine to 6000 GB (in general, you want to leave 50% extra room for incremental changes), Time Machine would only see about 1.8 TB. When I checked the DSM 3.2 interface, I found that it had indeed reset 6000 GB to 1804 GB. After trying to set it several times with the same result, I deduced that the maximum limit was 4096 GB, and that values were “wrapping” around that number. Sure enough, entering 4100 GB produced 4 GB, and entering 4096 GB actually turned into 0, shutting off the quota entirely!

After some back and forth with Synology customer service, they finally admitted this was true. (The first two times, they claimed the issue was Mac OS X 10.7 Time Machine not respecting quotas.) I hope they fix the software to at least warn the user, when they type a number over 4095, that they’ve exceeded the limit.
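
For the curious, the behavior looks like classic integer wrap-around. Here is a minimal sketch of my reconstruction of the bug – my guess at the logic, not Synology’s actual code:

```python
# Reconstruction of the observed bug: quota values appear to be stored
# modulo 4096 GB, so anything at or above 4096 silently wraps.
def dsm_effective_quota_gb(requested_gb):
    return requested_gb % 4096

print(dsm_effective_quota_gb(4100))  # 4  -- matches what I observed
print(dsm_effective_quota_gb(4096))  # 0  -- quota silently disabled
```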

The Solution: Disk Groups, Volumes & Shares

To solve the problem, I reverted to a more old-fashioned solution: partitions.  Of course, with a sophisticated, modern RAID box, this was a bit more complex.  The Synology DSM 3.2 software supports three relevant concepts:

  • Disk Groups:  You can take any number of the drives and “bind” them together as a disk group.
  • Volumes:  You can allocate an independent “volume” of any size over a disk group.
  • Shares:  You can specify a share on a given volume which is available to only certain users.

The key here is that normally you would use quotas to limit storage on shares for specific users. But since I was looking for a ~6 TB share, the 4096 GB quota limit made that impossible. By default, shares get access to the entire volume they are on, so the solution was to repartition the box into separate volumes.

As a result, I configured my box as follows:

  • One disk group across all 5 disks, configured for dual drive redundancy using Synology Hybrid Raid (SHR)
  • Three volumes: one for my iMac’s Time Machine (6000 GB), one for my wife’s MacBook Pro (1000 GB), and the remainder for general network storage (1.3 TB)
  • For each volume, I configured a single share, without quota limits. I gave my account access to my backup share, gave my wife access to her backup share, and gave everyone access to the general media share

Works like a charm. My iMac sees the 6TB volume for Time Machine, mounts it as needed, and backs up every hour. Thanks to the Synology’s incredible speed, most incremental backups happen in the background in seconds, without any noticeable performance lag. In fact, the original 4.05TB Time Machine backup took about 26 hours. On my NetGear ReadyNAS NV+, the same initial backup took almost a week.

Recommendation: Synology DS1511+

I just have to say that, despite some back and forth over the Time Machine issue, the Synology website, wiki and documentation are all well done. They are clearly responsive, even answering my issues over Twitter. Given the industrial design, features, and performance of the box, I have no trouble recommending the DS1511+ to anyone looking for a large (10TB+) network attached storage solution for backing up a mixed network.

Disclosure: Synology was kind enough to provide me the DS1511+ free of charge given my difficult situation.