Playing “Where’s Waldo” with the LinkedIn 100 Million Photo

Yesterday, LinkedIn celebrated reaching 100 million members… an amazing milestone.

As part of the celebration, the whole team in Mountain View gathered for a photo outside of the main building:

Now the fun part… can you play “Where’s Waldo?” and find me in the picture?

It was hard not to feel good about the scope of what LinkedIn has accomplished.  This photo was an amazing reminder of how many great people are working every day to help LinkedIn change the world.  In some ways, it's also a reminder that I'm a small part of that story.

Still early days.  So much more ahead of the team than behind it.

Book Review: The 4 Percent Universe


The 4 Percent Universe: Dark Matter, Dark Energy, and the Race to Discover the Rest of Reality

It has been a while since I’ve posted a book review to this blog, but after finishing a couple new books this past weekend, I thought a few readers might be interested in this one.

The 4 Percent Universe is a fairly typical “popular” physics book, namely one of the dozen or so books that get published every year to try to simplify modern physics for the casual reader.  Originally, I picked this book up based on a Wall Street Journal review that recommended it as an up-to-date assessment of current theory around dark matter and dark energy.

For those of you who haven’t followed the progress on these topics over the past two decades, dark matter is the common term for matter in the universe that we can detect through its gravitational effects, but can’t see through any traditional form of observation.  Dark matter, as it turns out, does not emit or react to photons, which are the basis of most forms of astronomical observation.  Dark energy is the term for the incredibly large amount of energy that has been calculated to exist in our universe, but that, once again, we haven’t been able to measure.  Both are fascinating outcomes of the development of mathematical theories of cosmology that predict facets of our universe that have not yet been measured or observed.  The “4%” in the book title refers to the fact that only about 4% of our universe consists of the traditional forms of matter and energy that most of humanity assumed was “everything” through the 20th century.

What makes this book different from most is the style of writing.  Instead of a chapter-by-chapter introduction and explanation of concepts, the entire book is presented as narrative, literally walking through the individual stories of the researchers and scientists who played different roles in discovering relevant theories and concepts.  As a result, it’s a much deeper look into the politics and competitiveness between scientists and academics of different disciplines (math, physics, astronomy, cosmology), as well as the bare-knuckles process of research, peer review, and all-too-common resistance to data and/or theories that don’t conform to existing canon.

Personally, I found the first 150 pages or so fairly boring – too far in the past for me to really engage with the play-by-play discoveries that led to an acceptance of cosmology, big bang theory, and inflation.  These are topics that Stephen Hawking covered fairly well in his books.  However, the last half of the book drew me in, as the narrative took over in presenting the mounting evidence for dark energy, with explanations of key experiments and theories in the past decade (as recently as 2007/2008).

As a result, I definitely recommend this book to those who fashion themselves “physics hobbyists”, or those who wish to remain up-to-date on modern cosmology.

What I Would Do with the Coke Freestyle

One of the best features of the new building that LinkedIn opened up at 2051 Stierlin Court in Mountain View is the new Coca-Cola Freestyle. The Freestyle is a modern soda fountain wrapped in a vending machine. You can order any one of over 100 different varieties of soft drink, ranging from flavored Dasani water to my personal (and discontinued) favorite: Diet Vanilla Coke.

After playing with the machine for a few months, I’ve realized that Coca-Cola is sitting on a massive marketing opportunity with this machine, if they execute it aggressively. Obviously, coming from the social web, I have a particular angle on how I would leverage this new machine in the marketplace.

How It Works

The Coke Freestyle may look like a vending machine, but its internals look more like a giant inkjet printer’s. It seems to have two types of cartridges: large, bulky core soft drink syrups (e.g. Coke, Sprite, Diet Coke) and smaller flavor syrups (vanilla, lime, sugar-free vanilla, etc). It combines these with a water source and carbonation on demand, in varying proportions.

Thus, a small number of core flavors and accent flavors can deliver a truly breathtaking variety of soft drinks. My six-year-old, for example, delights in flavors of Sprite he never imagined (including a distinctly fluorescent purple “Grape Sprite”).

Most exciting of all, the machine is equipped with a Verizon USB wireless modem, and is IP-capable. Nominally, this exists so that the machine can report daily on its supply levels, allowing service to know which cartridges need restocking.

While the software on the machine is extremely primitive, it’s this networked capability that has the potential to turn the Coke Freestyle into a game-changing marketing machine.

Give Customers Choice

The first step in redesigning the software on the Coke Freestyle is to start crowdsourcing new flavors. With a few simple variables, Coke could take this machine from offering 100 different drinks to thousands.

Want a Barq’s Root beer with Cherry? You got it. Coke Zero with Lemon? Whatever floats your boat.

A very small set of options could really open up thousands of possibilities:

  • Pick any base syrup
  • Pick up to two accent syrups
  • Let them “double” a flavor (e.g. Extra Cherry)
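Even that small menu of options goes a long way. As a rough sketch (the cartridge counts below are made up for illustration, not Coke's actual numbers), the combinatorics look like this:

```python
from math import comb

bases = 20    # hypothetical number of core syrups (Coke, Sprite, ...)
accents = 15  # hypothetical number of accent syrups (vanilla, lime, ...)

# Accent choices per drink: none, a single accent, or two accents where
# both may be the same flavor ("double") -- i.e. multisets of size 0, 1,
# or 2 drawn from the accent list.
accent_mixes = 1 + accents + comb(accents + 1, 2)

total_drinks = bases * accent_mixes
print(total_drinks)  # 20 * (1 + 15 + 120) = 2720 possible drinks
```

So even with modest cartridge counts, three simple rules push the menu from roughly 100 preset drinks into the thousands.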

Make Customization a Game

Now the fun begins. There are a number of game design principles we could apply here.

First, the machine could highlight the “top” custom flavors that have been recently ordered on the machine. This would serve as a mechanism to obtain “votes” for new flavors that expert users create.

To help mix up selection, the machine should also highlight recent or randomized choices to help ensure that a few custom drinks don’t run away with the voting.

In a perfect world, this voting would be personalized. Maybe people could name their drinks and take credit for their creations. This could be done by making the machine accessible to nearby smart phones. (BTW, this might be a great reason for people to have an account with cocacola.com, tied to their Facebook or Twitter accounts.)

Very soon, machines would develop their own popular custom flavors. Machines near each other can pick up flavors from the same geography. Local variations in preferences and popularity can turn into realtime market research and crowd sourced product development. Who knows? Maybe root beer with cherry syrup is a winner in East Texas? The best flavors can then efficiently be produced and promoted in geographies pre-tested by the Coke Freestyle network.

In this world, rather than devoting R&D effort to new brand extensions, Coke can focus on new base and accent syrups.

Stoke Distribution

The Coke Freestyle, by virtue of being a networked device, can also promote drinks effectively and price dynamically. If there is a big push for Diet Coke Vanilla, it can be highlighted and discounted appropriately. More importantly, like Starbucks, members with accounts can get promotional discounts and rewards to keep them coming back to Coke.

Price Aggressively

Given the incredible market research and distribution benefits of the Coke Freestyle, pricing becomes a really interesting question. Compared to “dumb” vending machines, these networked devices might enable heavily subsidized pricing. Given the relatively lower marginal cost of goods for a fountain-based machine, Coke might be able to fundamentally alter the dynamics of vending distribution by deploying machines in shopping malls and high traffic locations and undercutting competitive machines.

Who is to say that they couldn’t offer drinks at 25 cents each? If the primary goal of the machines is to generate and test brand extensions, there is a powerful motive to generate a large volume of frequent voting.

This is the real strategic question for Coke: are they willing to disrupt well known and established vending machine economics to build out a realtime market research platform? How much is this data potentially worth?

Does Coke Get It?

For the last few months, I’ve been trying to get network access to the Coke Freestyle to experiment with some of these concepts as a Hackday project. I was disappointed to find out that the machine currently has extremely limited services exposed over the network.

I hope Coke realizes what a winner they have on their hands with this machine.

Psychohistory: 2010 in Review

The stats helper monkeys at WordPress.com mulled over how this blog did in 2010, and here’s a high level summary of its overall blog health:

Healthy blog!

The Blog-Health-o-Meter™ reads Wow.

Crunchy numbers


The Louvre Museum has 8.5 million visitors per year. This blog was viewed about 440,000 times in 2010. If it were an exhibit at The Louvre Museum, it would take 19 days for that many people to see it.

 

In 2010, there were 23 new posts, growing the total archive of this blog to 698 posts. There were 10 pictures uploaded, taking up a total of 1MB. That’s about a picture per month.

The busiest day of the year was January 4th with 2,735 views. The most popular post that day was Café World Economics: Profit & Cafe Points.

Where did they come from?

The top referring sites in 2010 were google.com, facebook.com, twitter.com, stumbleupon.com, and zynga.com.

Some visitors came searching, mostly for ntfs mac, mkv to mp4 mac, hardest material, convert mkv to mp4 mac, and ntfs for mac.

Attractions in 2010

These are the posts and pages that got the most views in 2010.

  1. Café World Economics: Profit & Cafe Points (October 2009), 36 comments
  2. How to Convert MKV to MP4 on Mac OS X (March 2008), 53 comments
  3. How to Mount NTFS Drives on Mac OS X with Read/Write Access (May 2007), 13 comments and 2 Likes on WordPress.com
  4. How to Convert FLAC to Apple Lossless (MP4) on Mac OS X (March 2008), 18 comments
  5. How to Delete Individual Backups from Apple Time Machine (March 2008), 45 comments

Personal Finance: Refinancing a Residential Mortgage for 2011

One of my “To Do” list items for the end of 2010 was to investigate refinancing the mortgage on our house in Sunnyvale, CA.  As a sign of the decade, this is actually the third time we’ve looked to refinance our mortgage in about seven and a half years.  I was actually a bit surprised at the complexities involved, so I thought I’d share the results here on the blog.

Background

Our current mortgage is a “5/5 ARM” offered by Pentagon Federal Credit Union, a credit union that specializes in military families.  We completed that refinancing at the end of 2008, and I actually wrote a blog post about that experience if you’re curious about Pentagon Federal.  (Quick Summary: They are awesome, and I highly recommend them for low rates on home & auto loans.)

The “5/5 ARM” is an unusual program.  Like a normal 5/1 mortgage, it’s a 30-year loan with a fixed rate for the first 5 years.  But instead of repricing every year after that, it only reprices every five years.  It reprices based on a rate tied to US Treasuries, and can rise no more than 2% at a time.

This means that if you get a mortgage at 5%, it will be 5% for years 1-5, and then can rise as high as 7% for years 6-10.  There is a cap of 5% on the total life of the mortgage, so if Obama turns out to be Jimmy Carter II and rates have to go to 20% in 2019, you’re protected.  All of this is fairly standard for high quality mortgages, except for the 5 year repricing schedule.
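A quick sketch of how those caps interact (the market rates here are purely illustrative, and I'm assuming the common convention that the periodic cap limits movement in either direction):

```python
def next_rate(current, market, start_rate=0.05, periodic_cap=0.02, lifetime_cap=0.05):
    """Rate at a 5-year reset: move to the index-driven market rate,
    limited to +/- 2% per reset and +5% over the starting rate."""
    capped = min(market, current + periodic_cap, start_rate + lifetime_cap)
    return max(capped, current - periodic_cap)

rate = 0.05
for market in (0.09, 0.12):  # hypothetical market rates at the year-6 and year-11 resets
    rate = next_rate(rate, market)
    print(round(rate, 4))  # 0.07, then 0.09 -- each reset capped at +2%
```

Even if the underlying index spikes, the borrower's rate climbs in 2% steps and never exceeds the starting rate plus 5%.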

What makes this appealing is that the 5/5 rate tends to be the same as the 5/1 rate, so you are getting some extra stability effectively for free.  The only gotcha is that these are all conforming loans, so they have to meet the standard requirements ($417K limit for normal mortgages, $729K in “high cost” areas like Silicon Valley, 80% loan-to-value, etc).

The rate we got at the end of 2008 was 4.625%.  At the time, I thought that was the best rate we’d seen in 40 years, and it was good to grab.  Turns out, I was wrong about how low rates could go.

Why Did I Want to Refinance

Looking up rates on the internet can be very confusing.  The reason is that few sites offer a comprehensive average of rates, and more importantly, the ones that do tend to ignore complexities around terms like the number of points paid.  When you hear rates on the radio for a 3.875% 30-year fixed mortgage, you are hearing the interest rate that assumes a massive amount of up-front payment and some stricter-than-average terms.

I was exclusively looking for the “perfect repricing”:

  • No money down
  • Monthly payments drop
  • Interest rate drops
  • Total amount paid over life of the loan drops

You might be wondering why I would think this was possible.  Well, in 2004 and 2008, it was.  It turns out in 2010, there is no real free lunch.

Based on advertisements, and some spreadsheet calculations, it seemed like there was a real opportunity to achieve the above with current rates.  I was seeing advertisements for rates as low as 3.5% on 5/1 ARMs, which would not only drop our payment by hundreds of dollars per month, but literally would save us tens of thousands over the life of the loan.

Where Rates Are Now

This was my first surprise – it’s not that easy to get a great rate, even with great credit, with zero points.  It’s not that there aren’t great rates out there – there are, but the plain vanilla, no catches, no points and rock-bottom rate days seem to be behind us.

To evaluate options, I checked the following sources:

  • Internet searches at sites like bankrate.com
  • Quotes from big banks, like Wells Fargo and Bank of America
  • Quotes from credit unions, like Stanford Federal Credit Union and Pentagon Federal
  • Brokers like Quicken Loans

First, the Big Disappointment with Pentagon Federal

Pentagon Federal has a current price (as of 1/2/2011) on a 5/5 ARM of 3.5%.  Yes.  Awesome.  I was ready to just refinance and be done.

I should have known that there was a flaw with PenFed.  Sure, they offer great rates.  Sure, they offer clean terms.  But it turns out that there is one ugly fee that they do charge, and I was about to get caught in it.

On top of regular closing costs, title search, etc, Pentagon Federal charges a 1% origination fee when you refinance an existing Pentagon Federal mortgage.  So, for example, on a $500K mortgage, this would be an extra $5K.  Up front.  Not interest.  Not deductible.

I argued with them about it.  I escalated.  I tried sweet talk.  Nothing worked.  They admit that this is an incentive for me to leave Pentagon Federal.  They admit that it is bad for the customer.  They are not interested in changing it.

Strike 1. No worries, it’s a big internet out there, isn’t it?

Don’t Bother With These

Just don’t even bother wasting time with Bank of America, Wells Fargo, or no-name shops on the Internet offering mortgages.  You put in a bunch of time and effort, fill out forms, submit applications, etc.  The end result is underwhelming.

Countrywide, I actually miss you.

It’s pretty clear that the big banks really aren’t feeling the need to push to get people with great credit scores to refinance with them.  Whatever was driving the banks to want to “take your business” from other banks is clearly pretty weak.  I was actually a bit surprised, since I tend to think of a mortgage as a way for a bank to take a “loss leader” approach to getting a valued customer.

The Easy Orange Mortgage and Bi-Weekly Payments

ING Direct is the oddball in the group.  Since they originate their own loans and do not syndicate them, they set their own terms.  They have rates based on a $500K size and $750K size, and a variety of terms.  Definitely worth checking out, because some of their mortgages are best in class.

For example, their under $500K 5/1 is at 2.99%, with reasonable closing costs.

I spent quite a bit of time in Excel working on the options offered by ING Direct and their Easy Orange mortgages.  They offer both regular and “bi-weekly” versions.  In fact, most banks now seem to offer bi-weekly options for their loans.

If you are unfamiliar with the concept, a bi-weekly mortgage involves making a payment of 1/2 of the normal monthly payment every 2 weeks.  Since you pay more frequently (effectively you pay an extra month’s payment every year), you end up paying off your mortgage faster and with less interest.

Unfortunately, this largely seems to be a gimmick.  Technically, you can send money in early to almost any legitimate bank, and they’ll apply the early payment to principal without penalty.  Mathematically, it’s very hard to see the benefit of these types of programs once you price in the amount of cash you’d accumulate outside of your mortgage if you just put that extra payment in the bank.  Even with 0% interest, in the first ten years there is almost no measurable benefit to bi-weekly payments at current rates.  (By the way, here is a cool website that lets you calculate bi-weekly options without building a spreadsheet.)
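If you want to check the math yourself without a website, a minimal simulation does the trick (the loan size and rate below are hypothetical, and the final payment's slight overshoot is ignored since we only care about the comparison):

```python
def monthly_payment(principal, annual_rate, years=30):
    """Standard level-payment amortization formula."""
    r = annual_rate / 12
    n = years * 12
    return principal * r / (1 - (1 + r) ** -n)

def total_interest(principal, annual_rate, payment, periods_per_year):
    """Simulate level payments until the balance reaches zero."""
    r = annual_rate / periods_per_year
    balance, interest = principal, 0.0
    while balance > 0:
        accrued = balance * r
        interest += accrued
        balance += accrued - payment
    return interest

principal, rate = 500_000, 0.045  # hypothetical loan
pay = monthly_payment(principal, rate)
monthly = total_interest(principal, rate, pay, 12)       # 12 full payments/year
biweekly = total_interest(principal, rate, pay / 2, 26)  # 26 half-payments/year
print(round(pay, 2), round(monthly), round(biweekly))
```

The bi-weekly schedule does finish years early and saves real interest, but as noted above, you get essentially the same effect by sending that extra annual payment to principal (or a savings account) yourself.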

As a last note, I did discover that ING leaves a lot of terms open that could turn ugly.  For example, their Easy Orange mortgages are designed as balloon mortgages.  So in 10 years, the rate doesn’t adjust – you literally owe the entire remainder of the loan.  This is fine if you are allowed to refinance at the time, but ING does not guarantee that you will be able to.  So, this is a great loan if you plan on selling your house before the term is up, and a bad loan if you don’t want to be caught in a situation where you have to.

Close, But No Cigar

I was very impressed with the level of effort that Quicken Loans put into helping me, even though in the end, I didn’t use them.

At first, I was somewhere between annoyed and amused when I got a phone call the day after submitting my application.  By Day 2, when they had called 3 times, I was ready to be annoyed.  I decided to call back and let them know I wasn’t interested, but when I got them on the phone, they impressed me with the breadth of their knowledge about different options, and I was convinced they could help.

So I told them – find me a 3.5% 5/1 mortgage out there with zero points, and I’ll go with them.  I pointed them to PenFed, but didn’t tell them about the 1% fee I would face.  They went to work.

The next day, they found a few options, and I got a call from the Director of their team.  She wanted to clarify a few things in terms of income and home value, to evaluate all options.  In any case, she seemed sincerely interested in the business, which is more than I can say about any of the traditional banks.

They got close.  They found a 5/1 mortgage with $6600 up front costs and a 3.875% rate.  They also found a 5/1 at 3.5% rate, but that required $11.3K up front.  While both of these mathematically were good options compared to the 5/5 I have, I was disappointed at the size of the up front cost.

Strike 2. What’s left?

Final Decision

Fortunately, while searching the internet, I came across some great discussion boards about Pentagon Federal.  I figured that in a world of cheapskates, I could not be the only one complaining about refinancing with Pentagon Federal.  And I was right, in a way.

In the end, I discovered 2 things:

  • There really aren’t many other mortgage options that are better than Pentagon Federal for what I was looking for.
  • Pentagon Federal has a repricing program that is documented on their website, but that they never actually promote.

Here is the program.  If your mortgage conforms to these requirements:

  • Conventional Adjustable Rate Mortgages (ARM loans) are eligible. All other types of loans are not eligible.
  • Loan must be 100% owned by PenFed. The loan, or any portion of the loan, cannot have been sold, or committed to be sold to Fannie Mae, or any other public or private investor.
  • No late payments showing on first mortgage payment history over last 12 months.

If you meet the terms, they will reset your mortgage to the current rate for a fee of 1%.

Now, you may be wondering why I’d be excited about this.  After all, wasn’t the 1% fee the problem with refinancing with PenFed in the first place?

The answer is simple – a 1% fee on top of normal closing costs of $3000+ is prohibitive.  A 1% fee in lieu of closing costs is pretty attractive.  No points.  No title search fees.  No paperwork fees.  Nothing.  Just 1%, flat.

They reset your mortgage at the current rate, give you another five years before the next repricing, and they leave your mortgage term as is.  So, since our mortgage currently completes in 2038, it would keep that completion date.
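In rough numbers (everything below is hypothetical: the balance, remaining term, and new rate are made up to illustrate, and the standard amortization formula is assumed):

```python
def payment(principal, annual_rate, months):
    """Level monthly payment for a given balance, rate, and remaining term."""
    r = annual_rate / 12
    return principal * r / (1 - (1 + r) ** -months)

balance = 480_000         # hypothetical remaining balance
months_left = 27 * 12     # same completion date (~2038), so the term shrinks rather than resets
old_rate, new_rate = 0.04625, 0.035

old_pay = payment(balance, old_rate, months_left)
new_pay = payment(balance, new_rate, months_left)
print(round(old_pay, 2), round(new_pay, 2))
```

Because the end date stays fixed, the entire rate drop flows into a lower monthly payment and lower lifetime cost, and the 1% reset fee (here $4,800) is recovered from the payment savings within a couple of years.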

The result: lower monthly payment, lower total costs of the mortgage, dropped interest rate.

Swing and a Hit. Not perfect, but definitely the best option.  So we went for it.  Only took a phone call – no application, no paperwork.

Final Thoughts

The average duration of a home mortgage in the US is between 7 and 8 years, which tends to mean that mortgage rates correlate strongly with 7-Year Treasury rates.  In the past six weeks, the rates on US Treasuries have moved up quite a bit, likely in anticipation of an economic recovery, inflation, or both.

In any case, the decision to refinance is based on a huge number of factors, not the least of which is how long you plan to stay in your current home, and how secure you feel about your current job / income stream.

But if you’ve been thinking about refinancing, and you’ve just procrastinated, I’m hoping the info above will be useful.

Personal Finance: How to Rebalance Your Portfolio

One of the prudent financial housekeeping chores that people face every year is rebalancing their portfolio. Over the course of the year, some investments outperform, and others underperform.  As a result, the allocation that you so carefully planned at the beginning of the year has likely shifted.  If left unmanaged over the years, individuals can end up with profoundly more risk or worse performance than expected.

Rebalancing your portfolio annually tries to address this issue by forcing you to sell asset classes that outperformed in the previous year, and purchase those that underperformed.  In practice, I try to rebalance the week before New Years as a way of “cleaning up” going into the next year.  While most academic research points to rebalancing as healthy every one to three years, I find that annual rebalancing provides the following benefits:

  • Forces you to see how your investments performed for the year
  • Forces you to learn which asset classes actually did well during the year, and which didn’t
  • Forces you to re-assess the appropriate “asset mix” for your risk tolerance and financial situation
  • Forces you to revisit which investments you are using to represent each asset class (mutual funds, ETFs, individual securities, etc)
  • Forces you to actively engage with your portfolio, and reset your balance to the appropriate mix

I’ve just completed my rebalancing for 2011, and I thought I’d share some of the process here, in case it’s useful to anyone whose New Year’s Resolution is to be more proactive about their finances.

Rebalancing is actually a very simple process – it’s kind of surprising that basic financial tools like Mint and Quicken don’t actually help you do this.  Whether you’ve never rebalanced or you rebalance every year, there are fundamentally five steps:

  1. Assess your current investment portfolio, broken down by types of assets
  2. Calculate the percentage of your portfolio in each asset class
  3. Calculate the difference in dollars for each asset class in your portfolio from your ideal mix
  4. Finalize the list of investments you will use to represent each asset class
  5. Make the trades necessary (buys and sells) to bring your portfolio into balance

This can all be done within an hour, with the exception of making the trades.  Those can be spread over days, potentially, since certain types of securities (like mutual funds) take time to execute (typically 24 hours).

Step 1: Assess your current investment portfolio

Believe it or not, this can actually be the most time consuming part of the process, especially if you have accumulated accounts over the years and haven’t ever used any sort of tool like Quicken to pull your portfolio together accurately.

A few ground rules on how I think about portfolio allocation:

  • I’m a big believer in the research that shows that most of your long-term investment return is based on asset mix, not security selection.  This means I do not spend time picking individual stocks or bonds, or dabbling with actively managed mutual funds in general.
  • I use very broad definitions of asset classes.  For example:  “US Stocks” vs. “International Stocks” vs. “Emerging Markets”.   These tend to correlate with the standard definitions used broadly to define popular investment indexes. Technically, with sophisticated software and data, you can do very fine-grained breakdowns.  I don’t bother with this, as the resulting differences are statistically marginal vs. the effort / complexity involved.

Fear not, because with tools like Microsoft Excel or Google Docs, this has become much, much easier.  All you need to do is:

  • Make a spreadsheet with the following columns:  Security Name, Ticker, Shares, Share Price, Total Value, % of Total Portfolio
  • Fill in a row for everything you own, regardless of account

The second part is very important if you want to avoid the “mental accounting” that leads people to invest differently in one account versus another.  If you’ve worked at multiple companies, you may have multiple 401(k), IRA, college savings, and brokerage accounts in your name.   Obviously, reducing the number of accounts you have is helpful, but sometimes it’s unavoidable.  Maybe you have a Roth IRA, a regular IRA, and a 401(k) with your current employer.

This becomes important because certain accounts may have limited access to different types of investments.  For example, your Vanguard IRA might only let you buy Vanguard funds (not such a bad thing), while your 401(k) at work might limit you to some pretty meager options.  When we get to Step 5, we can take advantage of multiple accounts to get the right balance by buying the best investments in the accounts with the best access to them.

Here is an example screenshot of a simplified list of investments that I bet wouldn’t be that unusual in Silicon Valley.  This person has some Vanguard index funds that they purchased prudently the last time they looked at their portfolio, combined with some stocks they purchased based on TechCrunch articles.  (I wish I were kidding).

You can see immediately that this small amount of accounting can actually help organize your thinking about what you own, and force you to remember why you own it.

Notice, I do not recommend putting in columns showing how well an investment performed historically.  For rebalancing, you only care about the here and now.  The past is just that – the past.  Performance data will likely just add emotion to a decision that, when made best, should be purely analytical.

Step 2: Calculate the percentage of your portfolio in each asset class

The hardest part about this step is defining what you are going to use as “an asset class”.   There is no one right answer here – I’ve seen financial planners break down assets into literally dozens of classes.  I’ve also seen recommendations that literally only use two (stocks vs. bonds).

The great thing about asset classes is that you can always break down an existing bucket into sub-buckets.

For example, if you decide to have 30% of your money in bonds, and 70% in stocks, you can then easily make a 2nd level decision to split your stock money into 50% US and 50% international.  You can then make a third level decision to split the international money 2/3 for developed markets, and 1/3 for emerging markets.  In fact, for some people, this is a much easier way to make these decisions.  Do whatever works for you, but be consistent about it.
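The hierarchical approach translates directly into arithmetic: multiply down each level to get effective portfolio weights. Using the example split above:

```python
# Level 1: stocks vs. bonds
stocks, bonds = 0.70, 0.30
# Level 2: stocks split 50/50 between US and international
us = stocks * 0.50
intl = stocks * 0.50
# Level 3: international split 2/3 developed markets, 1/3 emerging
developed = intl * 2 / 3
emerging = intl * 1 / 3

mix = {"US": us, "Developed": developed, "Emerging": emerging, "Bonds": bonds}
for name, weight in mix.items():
    print(f"{name}: {weight:.1%}")  # US: 35.0%, Developed: 23.3%, Emerging: 11.7%, Bonds: 30.0%
```

Each level only has to sum to 100% of its parent bucket, which is what makes the decisions easier to reason about one layer at a time.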

Personally, I’ve gotten quite a bit of mileage using the following break downs:

  • Stocks
    — US Stocks
      · Large Cap
      · Mid Cap
      · Small Cap
    — International Stocks
      · Developed Markets
      · Emerging Markets
  • Fixed Income
    — Standard
    — Inflation Protected
  • Real Assets
    — Commodities
    — Real Estate

You can see in the screenshot above that calculating these buckets is fairly simple.  You just total up each group, and then divide by the portfolio total.  So in the example I provided, the individual has 11.5% of their money in fixed income.

As the size of your assets increase, more sophisticated breakdowns are likely warranted.  But for the purposes of this blog post, I think you get the idea.

With mutual funds, this can be tricky.  For example, did you know that the Vanguard Total Market fund is 70% Large Cap, 21% Mid Cap, and 9% Small Cap?  (I got this data off etfdb.com).  In order to solve this problem, I actually create a separate column for each asset class.  I then put the percentage for each fund in each column, totaling to 100%.  I then multiply those percentages by the amount invested in each fund, giving me an actual dollar amount per asset class.
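A sketch of that bookkeeping (the 70/21/9 split for the Vanguard Total Market fund is the breakdown quoted above; the dollar amount and ticker usage are hypothetical):

```python
holdings = {"VTI": 50_000}  # hypothetical dollars in Vanguard Total Market

# Percent of each fund belonging to each asset class (must sum to 100)
class_pct = {"VTI": {"Large Cap": 70, "Mid Cap": 21, "Small Cap": 9}}

exposure = {}
for fund, dollars in holdings.items():
    for asset_class, pct in class_pct[fund].items():
        exposure[asset_class] = exposure.get(asset_class, 0) + dollars * pct // 100

print(exposure)  # {'Large Cap': 35000, 'Mid Cap': 10500, 'Small Cap': 4500}
```

With one row per fund, a single pass like this turns a list of holdings into true dollar exposure per asset class, which is what the percentages in Step 2 should be computed from.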

Step 3: Calculate the difference in dollars for each asset class in your portfolio from your ideal mix

This is the step where your self-assessment turns toward action.  How far are you off plan?

The hardest problem here is the implied problem: what is your ideal mix?

There are quite a few rules of thumb out there, and more than enough magazines and books out there to tell you what this should be.  Unfortunately, all of them are over-simplified, and none of them likely apply exactly to you.  At minimum, it’s a whole separate blog post to come up with this.  Fortunately, if you pick up the 2011 planning issue from Smart Money, Kiplinger’s, or Money magazine, you’ll probably end up OK.

But let’s say our individual in question is a 30-year old engineer who believes in the rule of thumb that they should take 120 minus their age, put that in stocks and the remainder in bonds.  Let’s say also that they’ve read that their stock investments should be split 50/50 between the US & International, with at least 10% of their overall portfolio in Emerging Markets.

That would leave our hypothetical engineer with the following breakdown:

  • 90% in Stocks
    — 45% US Stocks
    — 35% Developed Markets
    — 10% Emerging Markets
  • 10% in Bonds

Based on the numbers from the first screenshot, they would create a spreadsheet table like this:

This shows that our hypothetical engineer needs to rebalance by selling US Stocks, Emerging Markets, and Bonds.  The extra money will be re-allocated to international stocks in developed markets.
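As a sketch of that calculation (the current balances below are hypothetical, chosen to match the direction of the trades above; the targets are the engineer's 45/35/10/10 mix):

```python
total = 200_000  # hypothetical portfolio value
current = {"US": 100_000, "Developed": 50_000, "Emerging": 28_000, "Bonds": 22_000}
target_pct = {"US": 45, "Developed": 35, "Emerging": 10, "Bonds": 10}

# Positive = dollars to buy, negative = dollars to sell
trades = {k: target_pct[k] * total // 100 - current[k] for k in target_pct}
print(trades)  # {'US': -10000, 'Developed': 20000, 'Emerging': -8000, 'Bonds': -2000}
```

The buys and sells net to zero by construction, since rebalancing only moves money between asset classes rather than adding or withdrawing it.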

Step 4: Finalize the list of investments you will use to represent each asset class

Most people skip this step, but that’s a real missed opportunity.   Once you decide how much money to allocate to a given asset class, it’s worth a bit of thought about what is the best way to capture the returns of that asset class.  For example, is owning Google, Apple & Goldman Sachs the best way to capture the returns of US Stocks?  I’m not a professional financial planner, so you shouldn’t take my advice here.  But my guess is that you’ll be hard pressed to find a professional who believes that those three stocks represent a balanced portfolio.

We live in an unprecedented time.   Individuals with a few hundred dollars to invest can go to a company like E*Trade, open an account, and for $9.99 buy shares in an ETF that represents all publicly traded stocks in the US, for an annual expense of 7 basis points.  That’s 0.07%, or just $7 for every $10,000 invested.  That is an unbelievable financial triumph.  Previously, only multi-millionaires had access to that type of investment, and they paid a lot more for the privilege than 7 basis points.

Personally, I’m heavily biased towards using these low cost, index-based ETF shares to represent most asset classes.  In fact, E*Trade lets you mark any ETF for “free dividend reinvestment” under their DRIP functionality.  As a result, you get all the benefits of mutual funds with lower annual costs!  It takes some research to find the best ETFs, and in some cases, standard no-load mutual funds are a better option.  (Once again, I’m not a professional, so do your own research on what securities make sense for you.)

The biggest exception to this is with 401(k) plans, where you have limited choices on what types of investments you can make.  In these cases, I evaluate all of the funds in the 401(k), find those that are “best in class”, and purposely “unbalance” the 401(k) to invest in those.  I then make up for that lack of balance with my investments outside the 401(k).  For example, let’s say your current 401(k) has excellent international funds, but poor US funds.  You can skew your 401(k) to international funds, and make your US investments outside of the 401(k) where there are better options.

For our hypothetical engineer, let’s say that he’s decided to stick with Series I Savings Bonds for his fixed income, and uses the Vanguard ETFs to represent the different stock classes.

Step 5: Make the trades necessary (buys and sells) to bring your portfolio into balance

It seems like this part should be simple, but it can be surprising how many complications arise.  For example:

  • Sometimes the model says to sell $112 of something.  The trading costs alone make a trade that small unlikely to be worthwhile.
  • Share prices change every day, and your model leaves you short a few dollars here and there.
  • The model doesn’t take into account commissions for trading
  • Some funds have fees
  • Some transactions have tax consequences
  • Some investments can only be purchased in one account, not another
  • Some investments cannot be bought in a given account (like a 401k)

As a result, there is no advice that will apply to everyone.  Taxes alone make this the time when you may have to consult a professional.
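One of these complications – trades too small to justify the commission – is at least easy to screen for mechanically.  Here’s a rough sketch, where the $9.99 commission and the 2% fee ceiling are arbitrary assumptions (adjust for your own broker and tolerance):

```python
# Skip trades where the commission would exceed a chosen fraction
# of the trade's value. All numbers here are arbitrary assumptions.

COMMISSION = 9.99       # hypothetical per-trade commission
MAX_FEE_RATIO = 0.02    # don't let fees exceed 2% of the trade
MIN_TRADE = COMMISSION / MAX_FEE_RATIO  # $499.50 minimum trade size

# Hypothetical dollar deltas produced by a rebalancing model
proposed = {
    "US Stocks": -4750,
    "Developed Markets": 8750,
    "Emerging Markets": -1500,
    "Bonds": -112,      # too small -- fees would eat ~9% of it
}

execute = {a: d for a, d in proposed.items() if abs(d) >= MIN_TRADE}
skip = {a: d for a, d in proposed.items() if abs(d) < MIN_TRADE}

print("Execute:", execute)
print("Skip (too small):", skip)
```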

In our hypothetical case, our engineer would:

  • Sell their stakes in Apple, Google, Goldman Sachs, and Teva
  • Decide to leave their Series I Bonds alone – not worth the trouble.  Take the extra money out of the developed markets stake.
  • Purchase / Sell shares in the Total Market, Ex-US, and Emerging Market ETFs to meet their new allocation goals

The following table shows how to use a spreadsheet to calculate the different trades (buys in Green, sales in Red):

Last Thoughts

I’ve been doing some version of the process above for at least fifteen years at this point, and it’s never failed to help me with my financial planning.  Of all the benefits described, the most important is that annual rebalancing gives me the fortitude to withstand the day-to-day gyrations of the markets, knowing that at the end of the year, I’ll get a shot to rebalance things.

There are a few “temptations” that I’ve noticed could lead someone astray:

  • Changing the “ideal asset mix” year-to-year based not on financial research, but based on what’s “hot” at the moment.  For example, if you find yourself saying that Gold should be 10% of your portfolio one year, and then the next year it’s “Farmland”, you’ve got some popular investing psychology drifting into your process.
  • Picking arbitrary “hot stocks” to represent asset classes.  This can lead to a double-whammy: you not only pick a bad stock, but you also miss out on key gains in your selected asset class.
  • Splitting hairs.  Don’t stress about small dollar amounts, or potentially, asset classes when your portfolio is small.  I remember investing the first $2500 I ever made from a summer job, and I got a little carried away with the breakdown.  In general, you can get pretty far with just the “Total Stock Market” and “Total Bond Market”.

This was a really long blog post, but hopefully it will prove useful to those who are interested in balancing their portfolios, or just curious about how other people do it.  In either case, please comment or email if you find mistakes, or have additional questions.   Happy to turn the comment section here into a useful discussion.

Home Network Wireless Topology: Fixed

I think I’ve finally found a wireless network topology that works at my house.   It took a bit more equipment than I think should have been necessary, but in the end, it was a small price to pay for having my increasing array of network-dependent devices running smoothly.

Since my guess is that there are a few other suckers like me out there trying to get this to work, I’ll share my final solution.

Problem

Until recently, my home network was plagued by the following issues:

  • AppleTV in the living room would fail to stream, seemingly due to lost connections
  • Tivo HD in living room would periodically complain of being unable to connect to network
  • Nintendo Wii was shockingly slow connecting to network
  • Tivo HD in bedroom would be unable to play video from other room
  • AppleTV in bedroom would periodically fail to stream

Now, it’s not like the above happened all the time.  I never had a problem with an iPhone / iPad / Windows laptop / MacBook connecting to the network.  It was largely restricted to my video devices.  Unfortunately, it was infrequent enough that I could believe everything was configured correctly, but often enough that deep down, I felt like there were Gremlins in the building.

The Solution

The culprit turned out to be a circa-2008 Airport Extreme that I was using to drive my 802.11n network from the office.  It turns out, the older Airport Extreme can handle either the 2.4GHz or 5GHz band, but not both simultaneously.  Since the iPhone / iPod uses 2.4GHz, for compatibility you are effectively stuck at 2.4GHz.   In addition, my office sits at the opposite corner of the house from the bedroom.  Not ideal, spatially, for the hub of my network.  The living room is more centrally located.

I began to suspect that the number of wireless devices that I owned had crossed some threshold, and the amount of interference and cross-talk was leading to unpredictable behavior.

As a solution, I purchased a newer Airport Extreme base station, with dual-band support.   However, instead of replacing the old base station, I added it to the living room as a network extension of the existing wireless network.  In order to do this, you need to do the following:

  • Open up the “Airport Utility” in the  “Utilities” folder in “Applications” (on Mac OS X 10.6 Snow Leopard)
  • Click the “Manual Setup” button in the bottom left, to configure the base station
  • Select “Extend a wireless network” under the “Wireless” tab

It’s a little tricky to find, because the default set-up flow offers no option to extend a network.

This provided three benefits:

  • All the devices in the living room are now connected via Ethernet to the Airport Extreme base station.  Significantly less chatter on the network.
  • The newer devices in the house are now seamlessly connecting via 5Ghz when they can to the Living Room base station
  • The bedroom devices are selecting the living room base station instead of the office due to signal strength.

Basically, there is a fairly constant 2.4GHz wireless “pipe” between the Living Room base station and the Office base station, and devices throughout the house auto-select the best connection.  The living room base station aggregates the traffic over its Ethernet switch and wireless endpoints, and then pipes it to my office network when necessary.

In the office, my iMac (which is my iTunes server) is connected via a Gigabit Switch to the Airport base station, the Infrant ReadyNAS NV+, and the AT&T Uverse Router.

I’m assuming that the bridging implementation between the two Airport Extreme base stations is extremely efficient – more efficient than having a large number of devices independently competing for access to the base station in the Office.

I’ve already noticed that the new AppleTV 2 devices are extremely happy with this setup, and I get 720p HD streaming in both the living room and the bedroom from the iMac with only a few seconds of buffering.  Hopefully, this will prove a durable and performant topology for 2011.

Adam Nash is Metro Man

I got this sent to me in email today.  It seems to have become a running joke among a few of my fellow LinkedIn employees.

Two thoughts immediately come to mind:

  1. Do I need to change my official superhero for 500 Startups? (currently: Optimus Prime)
  2. Am I missing something funny about this comparison?  This seems way too flattering…

I will say one thing – I’m going to have to hit the gym a bit more to fit into that costume.

Why T-Shirts Matter

During my tenure at LinkedIn, I’ve held a wide variety of roles and responsibilities within the company.  Some are fairly public (as described on my LinkedIn profile).  Others are the type that you’d never find formally discussed, and yet would be no less true if you asked anyone who worked at the company.

In a rare combination of serendipity, passion, and empowerment, I personally ended up with one of those unspoken roles: the most prodigious producer of LinkedIn t-shirts.

2010 LinkedIn for Breast Cancer Awareness Shirt

At the recent Silicon Valley Comes to the UK trip, I had the chance to have a great conversation with Dave Hornik on why making t-shirts matters to high tech start-ups.   Believe it or not, I felt that this was a subject important enough to capture in a blog post.  (My friends from The Clothing People and I will write a separate blog post on how to make truly great high tech t-shirts, which is a field of expertise unto itself.)

Why T-Shirts Matter

At a high level, understanding the typical culture at a high tech startup can be difficult for those who haven’t worked for one.  The best analogy I can think of is to put yourself back in time, to when you were between 8 – 12 years old.  Now, think carefully about the things that 8 – 12 year old boys like (at least, the geeky ones).  Video games.  Caffeine.  Toys.  Computers.  Bean bag chairs.  Junk food.  This should help orient you, and brings you to the right frame of mind about t-shirts.

T-shirts are a part of that culture.  In part, t-shirts represent the ultimate middle finger to those unnamed sources of authority who wanted software engineers to dress like “Thomas Anderson” in the Matrix.  Software engineers want to be Neo, not Thomas Anderson.

This leads us to the reasons why t-shirts matter:

Empowerment.  In some ways, engineers delight in having found a profession where their intellect and passion for technology have enabled them to earn a great living and work at a company where – yes, you guessed it – they can wear t-shirts to work.  Giving out t-shirts tells your employees, implicitly, that you get it.  You hire only the best, and the best can wear whatever they want.  It says you know that you value merit over appearance; a working prototype over an MBA.

Incentives.  Over the past decade, behavioral finance has taught us that people don’t value money rationally – it varies depending on form and context.  You can bring a $20 bottle of wine to your girlfriend’s parents’ house and be thought a gentleman.  Handing her Mom a $20 bill at the door isn’t looked on the same way.   Let me just tell you, free t-shirts evoke some sort of primal response at a high tech company.  I’ve often said that a high tech company would see less excitement handing out $100 bills than handing out free t-shirts.  High tech companies are filled with benefits that cost hundreds of thousands of dollars per year, benefit a minority of employees, and are generally under-appreciated.  You’d be shocked at what a $200 per person per year budget for t-shirts will do for employee morale by comparison.

Tribal Cohesion. There are a lot of reasons why many institutions require employees to wear uniforms.  Common appearance can be a reminder that the person represents the company.  More importantly, common dress signals who is “part of the tribe” and belongs to the corporate family.  Uniforms are incompatible with the “empowerment” aspect of how people want to dress, but t-shirts can represent a form of “voluntary uniform” if produced in sufficient variety and quantity.   This effect can be had at a team level, when a t-shirt is made just to celebrate a new product, or at the company level.  It has a profound effect on new hires, as well, who desperately want “a shirt” so they can fit in.  It may sound subversive, but t-shirts can provide many of the same benefits of camaraderie and tribal cohesion that uniforms did, without the top-down oppression.

Tenure Based Seniority. High tech companies are largely meritocratic, and as they grow they tend to define roles based on skills & experience rather than “time at the company”.  However, there are positive aspects to rewarding those who have “bled for the company” over the years, and put their hearts and souls into building the business.  T-Shirts, in an innocuous way, implicitly do this by almost always becoming “limited editions”.  Want the t-shirt from the 2007 company picnic?  You had to be there to get one.  How about the shirt from the first intern program?   The launch of a game-changing new product?  Even shirts that are given out to the whole company will become rare at a company that’s growing rapidly.  In a socially acceptable way, t-shirts subtly communicate a form of tenure that is warm, and yet structured.

Branding.  As discussed under “Tribal Cohesion”, people want to wear the brand of their tribe.  They will wear them out everywhere if you let them.  Let them.  While being careful not to interfere with the uniqueness of shirts given to employees, make shirts for your developers, your fans, your early adopters.  Long before they become vocal advocates for your brand, they will gladly showcase it if you let them.  This tends to work best in relatively inter-connected, dense, techy cultures like Silicon Valley, but you’d be surprised how far your reach might be.  Of course, this assumes that you make shirts that don’t suck, but we’ll cover that in the next blog post.

So How Do I Make Great Shirts?

It turns out that this is a lot harder than it appears.  Mario always tells me my blog posts are too long, so I’m going to save this topic for the next post…

Steve Jobs is The Mule. Is There a Second Foundation?

This blog post could have been titled “We don’t live in the universe of maximum probability“, but that didn’t sound quite as exciting.

This weekend, I was having a friendly debate with a close friend about the state of the open web, when the now-typical issue rose up: Apple, its support of native applications, and the resulting impact on the web.  I immediately thought about the fact that, in the 1990s, we would never have dreamed of the technology landscape of 2010 — a landscape where Apple was the dominant force in mobile computing.  A world where we would see a massive resurgence of interest in client applications (yes, that’s what those pretty iPhone and Android apps are).  A world where Apple was the most valuable technology company in the world.

Then it hit me.  The parallel to one of the best science fiction stories of all time.  In fact, it’s the story that led to the name of this blog.

Isaac Asimov’s Foundation Trilogy.

Asimov’s Foundation is based on the future history of the Galaxy, when a lone scientist, Hari Seldon, invents a new science called “Psychohistory“, that allows him to predict the future.  This science allows him to project that the Galactic Empire will crumble and bring about 30,000 years of dark ages.  Instead, he develops a plan to create a “Foundation” to preserve knowledge, and reduce the period of regression to a mere 1000 years.  Unfortunately, his plan is disrupted by an unpredicted complication.

Check out this synopsis from wikipedia, and see if it sounds familiar:

The Mule is a fictional character from Isaac Asimov‘s Foundation series.[1] One of the greatest conquerors the galaxy has ever seen, he is a mentalic who has the ability to reach into the minds of others and “adjust” their emotions, individually or en masse, using this capability to forcibly enlist them to his cause. Individuals who have their emotions adjusted behave otherwise normally, with their logic, memories and personality intact; even if they are aware of the manipulation, they are unable to desire to resist it. This gives the Mule the capacity to disrupt Seldon’s plan by invalidating Seldon’s assumption that no single individual could have a measurable effect on galactic socio-historical trends on their own, due to the plan relying on the predictability of action of very large numbers of people.

Tell me that doesn’t sound like Steve Jobs.  You can read the full article here.

Just replace:

  • “Steve Jobs” for “The Mule”
  • “Apple” for “The Union of Worlds”
  • “The Open Web” for “Seldon’s Plan”

And I think you have a fair approximation of what’s happened in the last five years.

One of the hottest debates in mobile right now is whether to focus on the mobile web or native applications.  Ironically, Apple is the one who started this debate, since they were the first company to launch a phone with a truly modern web browser (Mobile Safari), and then proceeded to launch a simple, accessible native application platform on top of it.

In all seriousness, the reason that the native applications on the iPhone (and iPod / iPad) are such a viable threat is due to the fact that they are working.  When I say working, I mean that any company who takes their mobile web property, and then deploys a native iPhone application, tends to see a significant boost in their engagement metrics.  Apple has solved a distribution and engagement problem for mobile applications at an unprecedented scale, and it shows in the numbers.  Metrics usually speak louder than philosophy when making tactical decisions, which is why you see the incredible investment and interest in native applications for iOS devices.

In the story, the Mule is defeated by the Second Foundation, and rendered harmless and without ambition.  He dies without a successor, hence the name “The Mule”.

I think the question we should all be asking at this point is, “Is there a Second Foundation?”

America the Beautiful 5 Ounce Silver Coins are Gorgeous

The United States Mint has released images of the new 5-ounce America the Beautiful Silver Bullion Coins and they are gorgeous.

Source: Coin News

The designs are legislated to be nearly identical to the new quarters, except that they are a full three inches in diameter, making them large (and thin).

These are the first five ounce bullion coins produced by the US Mint, and are not going to be sold directly from the US Mint website.  This, of course, means that collectors are going to have to pay an unnecessary markup from dealers to get them.

Silver is now trading at over $21 / ounce, so expect each coin to be at least $110.

They are making one for each of the new quarters – five in all.  They are minting 500,000 total – 100,000 of each design.

I expect these to sell out quickly due to their unique design, size, and limited mintage.  $110 might seem expensive for a coin, but compared to escalating gold prices, silver arbitrarily seems “affordable”.

Solution: Denon A/V Receiver AVR 1410/790 With No Sound

As per my normal practice, when I spend more than 20 minutes on a technical problem, and I find the solution difficult to find on Google, I document it here.  I figure that at least I can save some time for “the next sucker” who runs into the issue.

Tonight, I came home to find my wife complaining that she could not get sound from the Living Room TV.  I was able to quickly ascertain that the issue was the Denon AVR-1410 – our A/V Receiver.

Symptoms:

  • No ability to select normal audio surround options – only stereo ones show up in the selector
  • No sound on any A/V input, whether HDMI or analog
  • When you go to configuration menus, the option to configure the audio is missing
  • A periodic clicking noise
  • The periodic display of “H/P Input On” on the receiver face

That last one was the real key to the puzzle.

After browsing countless pages, I found this one:

Fixya: H/P Input On

Solution:

  • H/P stands for “Headphones”
  • Some dust has likely gotten into the Headphone jack, creating a false auto-detection that you have headphones plugged in
  • The switching noise is the short/detection firing on/off
  • To fix it, just blow hard into the headphone jack to clear the dust (or plug in headphones and remove them)

So, after messing with wires, configurations, and every button on the remote, a simple puff of air did the trick.

Hope this helps someone out there.  I thought my A/V Receiver was possessed for a little while.

Want Engagement? Find the Heat.

If you talk to product managers, designers, and engineers at almost any consumer internet company these days, you’ll find that they measure their success largely across three dimensions:

  • Growth (more users)
  • Revenue (more money)
  • Engagement (more visits, more activity per visit)

Believe it or not, it’s that last bullet which is the ultimate coin of the realm: engagement.  How to measure it.  How to design for it.  How to predict it.  How to generate it.

The assumption is that engagement is a proxy for the strength of the relationship with the consumer, and thus leads to both strategic advantage as well as long term monetization.

There is no one simple answer to the question of how to design and build highly engaging products and features.  Game mechanics (thanks in large part to Amy Jo Kim) has become the de facto answer for designing for engagement on the consumer internet in the past few years.  However, in the last few months, I’ve been advocating a new frame for product managers and designers to think about engagement in their products, particularly content-based applications.

Find. The. Heat.

Given the phenomenal success of Google, most modern consumer internet companies are heavily influenced by its product culture, whether they care to admit it or not.  Google made relevance the gold standard for content, and machine generated algorithms for sifting and sorting that content the scalable solution.

But when it comes to content, it’s worth considering things that frankly our colleagues in old media have known for a very long time.

There is a big difference between:

  • Content that you should read / view
  • Content that you want to read / view
  • Content that you actually read / view

It’s not an accident that there is a spectrum of news content, ranging from PBS -> 60 Minutes -> CNN -> Fox News / MSNBC.

The difference?  Heat.

For several years, I’ve been largely focused on designing products with two separate goals in mind, always in tension.  Relevance: ensuring that the content and features presented to the user are as productive as possible.  Delight: ensuring that the user experiences that mix of surprise, happiness, and comfort from using the product.  Jason Putorti, former designer at Mint.com and current Designer in Residence @ Bessemer, has often advocated designing for delight.

Heat, however, is not the same as delight.  But heat might be more important than delight for content-based applications.

Let me explain.  Heat covers a multitude of strong emotions.  Vice.  Virtue.  Delight.  Disgust.  Anger.  Thrill.

You can generate heat by showing people content they love… and also by showing them content that they hate.  When you get to the heart of why people share content, you realize that YouTube had virality long before social networks, feeds, and other forms of viral growth were around.  What they had was content that people wanted to share so much, they would cut and paste arcane text strings into emails and send them around.

Heat makes many technologists uncomfortable.  First, it’s emotional and irrational.  Second, it’s typically at odds with strict definitions of relevance and utility.

But as the theme of this entire blog goes, people are predictably irrational.  TV producers and writers tend to be experts in detecting heat from their audiences, and generating content to match it.  I believe that, just as Google revolutionized the automatic surfacing of relevant content, we can also automate the surfacing of content that generates heat.

This is fairly obvious in politics, as an example.  I can generate highly personalized and relevant content by showing liberal users articles from Daily Kos about health care.  But I can generate heat from that same audience by surfacing articles by Karl Rove on the same topic to those users.

Which are they more likely to click on?  Which are they most likely to share?

Which one generates the most heat?  Which one is “better” for them?

Please note, I am not advocating designing for heat as any form of solitary framework for building engaging products.  However, I have personally found in the past few months that this line of thinking helps inspire me to come up with far more interesting ideas for feature design.  It also seems to help teams that I work with get over mental blocks that lead to dry, boring, unemotional, data-driven content features.

Try it.

Find the heat.

The Incentives for Inflation Going Forward

In my last blog post, Lessons from the Masters of Deflation, I alluded to an upcoming article on why I expect heavy pressure towards inflation in the United States in the coming years.  I don’t think I’m at all unique in this projection – there are currently a huge number of economic and financial analysts who expect significant inflation in the coming years in the United States.  The rationale is almost universally the unprecedented expansion of the money supply by the Federal Reserve.  With over $2 Trillion on the balance sheet, and the acquisition of debt of questionable value, it’s easy to look at the incredible growth in M2 (a measure of money supply) and project out inflation once the economy recovers.

My rationale for significant inflation in the future is not actually based on these facts, although I don’t dispute them per se.  The rapid deleveraging of our economy argues for short-term deflation.  The massive and hastily executed fiscal stimulus and monetary expansion argue for long-term inflation.

I’m going to argue instead that you should just follow the incentives.  It is fairly obvious that a vast majority of Americans will benefit in the short term from a significant devaluation of the dollar.  If you believe that this country’s politics (and economics) tend to follow the majority opinion, then it seems like just a matter of time before we talk ourselves into policies that lead to inflation.

For the sake of argument, let’s assume that we could conjure up an instant 25% devaluation of the dollar. In this world, everything that costs $1 now will cost $1.25 tomorrow. Let’s look at some of the large groups of Americans that will benefit from this type of massive devaluation:

  • Homeowners.  A dominant majority of American households are homeowners, and almost all of them carry weighty mortgages.  More importantly, by some counts, almost 20% of those mortgages are underwater.  Inflation to the rescue!  In this world, every $400,000 house is now worth $500,000.  But of course, the mortgages themselves don’t grow, since they were written in the past.  Debtors love inflation, because they get to pay off old debts.
  • Federal Government. This is a two-fer.  First, most of our taxes aren’t indexed to inflation.  Want to keep that promise to tax only people making over $250K?  Devalue the dollar, and now more people will cross that threshold.  Capital Gains taxes?  Booyah, those aren’t indexed to inflation either.  Now everyone whose stock just keeps pace with the devaluation will owe taxes to boot!  Besides increasing revenue, the devaluation makes it easy to pay off bond holders of those trillions of dollars of debt, since they are all denominated in dollars.  With benefits like these, why stop at a 25% devaluation?  Let’s go for 100%!
  • Consumers in Debt. The average American has thousands of dollars in debt.  Assuming that wage inflation approximates price inflation, consumers can benefit from seeing increased nominal wages, and then paying off debts that were made before the devaluation.
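The homeowner case is easy to work through with numbers.  A quick sketch of the 25% devaluation scenario, using hypothetical figures:

```python
# Why debtors like inflation: the asset price inflates, but the
# mortgage balance stays fixed in nominal dollars.
# All figures here are hypothetical.

devaluation = 0.25      # the instant 25% devaluation from the text
house_value = 400_000
mortgage = 380_000      # a nearly-underwater homeowner

equity_before = house_value - mortgage
equity_after = house_value * (1 + devaluation) - mortgage

print(f"Equity before: ${equity_before:,}")
print(f"Equity after:  ${equity_after:,.0f}")
```

The homeowner’s equity jumps from $20,000 to $120,000, while the mortgage balance never moved.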

Sense a common theme here?  Debtors.  The United States is a nation of debtors.  Individual households are in debt.  State governments are in debt.  Homeowners are in debt.  The Federal Government is in debt.  Debtors, in the short term, love devaluation because it means they get to pay off old borrowing with inflated currency.  On average, we are in debt, which means, on average, we’re incented to devalue the dollar.

In fact, you could argue that Japan, a nation of savers, has been stuck in a deflationary spiral precisely because, as a nation of savers, they benefit on average from seeing their saved Yen go farther at the market.  Of course, the younger Japanese don’t see those benefits, but thanks to aging demographics, they are outnumbered by older generations who saved massive amounts of wealth.

Yes, I know I am grotesquely oversimplifying the ramifications for all parties involved once an inflationary spiral takes hold.  And believe me, I do not believe that this is a good outcome for the country (or the world economy).

Of course, I am a saver, so I would be biased against inflation…

Lessons from the Masters of Deflation

You can’t open a decent newspaper these days without coming across an article warning of impending deflation.  (Yes, I know.  How many people still open a decent newspaper?) Deflation, the Bizarro twin of inflation, has been a major concern for the United States since the financial crisis unfolded in 2008, and fears of a Japan-style lost decade emerged.

We’re now two years into the unfolding drama, and fear of deflation has resurged in the past few months as the sovereign debt crisis in Europe has led to a spike in the value in the dollar, a potential for weakening global demand, and the threat of a double-dip recession.  While I personally don’t believe we’ll see an extended period of deflation given the current monetary & fiscal incentives in our country (a blog post on this topic is coming), I do think a few years of borderline deflation may still occur.

From today’s Wall Street Journal:

The old bogeyman of deflation has re-emerged as a worry for the U.S. economy. Here’s something else to fret about: After studying more than a decade of deflation in Japan, economists have slowly realized they have no idea how it works.

Every time you see a piece on deflation, you find references to Japan.  This is not unexpected – Japan is the second-largest economy in the world, and it wasn’t too long ago that many highly educated people thought that it would usurp the US role as the dominant western economy.  This is really the only large-scale modern example of deflation – to find another you have to revisit the 1930s, and too many elements of our system have changed for those analogies to be completely helpful.  In fact, I see some pieces stretch back into the 1890s at times.

Unfortunately, Japan has been a wreck of an example.  It pursued massive borrowing and Keynesian stimulus, running its national debt to over 200% of GDP.  In fact, the most notable thing Japan has achieved is setting incredible new records for how much debt a country can take on without completely imploding.  This is similar in some ways to setting new records for over-eating.  Impressive, scary, and not something that inspires you to try it yourself.

However, if you want to understand deflation, and more importantly how to handle deflation, you need to turn to the true masters of deflation.  That’s right, living in our midst, there are huge multi-billion dollar economies that have not only survived a deflationary environment for forty years, they’ve thrived in it.

I’m talking, of course, about the children of Moore’s Law: our high tech industry.  Moore’s Law (circa 1975), loosely put, predicts that the number of transistors you can put on a chip for a fixed cost will double every two years.  This is the equivalent of saying that the price of a given amount of circuitry will drop by 50% every two years.

That’s deflation of roughly 29% per year (solve (1 − d)² = 0.5 for d).  Put that in your pipe and smoke it.
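A quick sanity check of that annualized rate, sketched in a few lines of Python (the two-year halving is the only input):

```python
# A 50% price drop every two years means (1 - d)^2 = 0.5,
# so the annual deflation rate d is 1 - sqrt(0.5).
annual_deflation = 1 - 0.5 ** (1 / 2)
print(f"{annual_deflation:.2%}")  # prints "29.29%"
```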

But the industry has thrived, and looking at the financial structure of high tech companies, you can learn a lot about the topsy-turvy logic of deflation and how individuals can cope.

  • Debt is Bad. For decades, high tech companies have resisted the traditional financial wisdom of adding leverage to their balance sheets.  Why?  Theoretically, leverage is one of the key ingredients in Return on Equity, a primary measure of financial performance.  The answer is that when it comes to deflation, debt can kill you.

    In an inflationary environment, being a lender is tough.  There is a risk that inflation will eat up the gains (or more) of the interest you are charging.  If I loan you $10,000 at 5%, and inflation jumps to 8%, I’m losing 3% on the deal.  $300/year in lost purchasing power is tough, but imagine that being $3B on a $100B loan portfolio.  This is because as a lender, my return is the interest rate I charge MINUS the inflation.

    In a deflationary environment, the roles are reversed.  As a lender, I’ll lend you money at 0%!  After all, if deflation increases the value of a dollar by 3%, then I effectively make 3% on a 0% loan.  My return as a lender is the interest rate PLUS the deflation.  The borrower gets the other end of the deal: not only do they have to pay the interest, they have to pay it back with higher value dollars in the future.  Ouch.

    Moral of the story: In a deflationary environment, you do not want to owe debt. This is why deflationary environments lead to massive deleveraging.  You do not want to be caught holding a debt denominated in low value today dollars, forced to pay it back with higher value tomorrow dollars.
  • Don’t Buy Today What You Can Buy Tomorrow. This is something that any avid purchaser of computer equipment knows.  You pay a lot for the privilege of buying computing power today.  I guarantee you, it will be cheaper 6 months from now.  Want a 2TB hard drive?  Just wait a few months for significant discounts.  Want that Mac Mini?  It will be cheaper (or faster) in a year.  Same item, same condition, same quality – lower price in the future.  That is what deflation looks like.

    In a deflationary environment, on average, items will cost fewer dollars in the future than they do today.  So if you don’t need it now, you should wait.  In fact, you are paid to wait.  Literally.  High tech companies know this – they don’t source components until they absolutely need them to put in boxes.  High tech consumers know this too.  Want to buy a 42″ LCD TV?  Wait a year; I promise you that exact same model, brand new in the box, will be a lot cheaper.

    This may not seem weird to you, but think about it for a second.  It’s not normal.  To keep the box the same price, most consumer products companies literally shrink what they are offering you, or raise the price.  In high tech, companies regularly have to double what they give you every two years just to keep the price the same!  This is also why high tech companies are desperate to unload inventory as soon as possible… within days.  When I was at Apple, we moved our days of inventory on the books from eight weeks to just under two days!  Dell at the time was at six days.  Just six days of inventory!  That’s how you handle deflation.

    Moral of the story:  If you don’t absolutely need it now, wait. In inflationary environments, we buy now to avoid paying a higher price in the future.  In deflationary environments, the later you buy, the cheaper it is.  So don’t buy it unless you need to use it, immediately.

  • Success Depends on Increasing Value through Innovation. We take this for granted now in the high tech industry, but let’s face it: high tech is unique.  If the internal combustion engine had followed Moore’s Law, we wouldn’t be worried about oil usage right now, because we’d all be getting over 1M miles to the gallon.

    What people don’t realize about Moore’s Law is that it isn’t some government regulation.  No one is handing out 2x performance every two years for high tech companies to cash in periodically.  Literally hundreds of thousands of brilliant people, across a range of disciplines, degree programs, and commercial ventures, are constantly working ahead of the curve, inventing the technologies that keep delivering that incredible curve.

    It’s a trap, in a way.  The innovation that makes the deflationary environment a fact of life is also the path to surviving it.  If you miss the next step on the curve, you’ll find that your products are quickly worth only half as much, while your more innovative competitor is still collecting full price.

    This is tough to handle at an individual level.  In an inflationary environment, everyone gets some form of raise to “adjust for inflation”.  In a deflationary environment, everyone should get a pay cut to “adjust for deflation”.  However, since employees, managers, unions and even governments hate to see this happen, you tend to see layoffs instead.  It’s a vicious productivity war.  If you want to earn the same paycheck next year, and deflation is running at 3%, you have to be 3% more productive to make that math work for the business.  At the company level, a business needs to deliver productivity gains every year at a rate above deflation just to tread water.

    Moral of the story:  There is no coasting in a deflationary environment, no rising tide that lifts all boats. Inflation may be an illusion of more money, but it’s an illusion that people emotionally depend on.  Deflation forces people to come to terms with a basic economic fact – if you aren’t able to produce more at the same cost next year, you’ll likely be worth less next year.
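The arithmetic behind the three morals above can be sketched in a few lines of Python.  The rates are the illustrative ones from the examples (a 5% loan under 8% inflation, 3% deflation), and `real_return` and `future_price` are just hypothetical helper names.  Note that the exact Fisher relation gives -2.78% where the loose "rate minus inflation" shorthand says -3%:

```python
def real_return(nominal_rate, inflation):
    """Lender's real return via the exact Fisher relation.
    Pass a negative inflation rate to model deflation."""
    return (1 + nominal_rate) / (1 + inflation) - 1

def future_price(price_today, annual_deflation, years):
    """Sticker price after `years` of steady deflation."""
    return price_today * (1 - annual_deflation) ** years

# 1. Debt is bad: a 5% loan under 8% inflation loses purchasing
#    power, while a 0% loan under 3% deflation actually earns it.
print(f"{real_return(0.05, 0.08):+.2%}")   # prints "-2.78%"
print(f"{real_return(0.00, -0.03):+.2%}")  # prints "+3.09%"

# 2. Don't buy today: a $1,000 item under 3% annual deflation
#    costs noticeably less two years out.
print(round(future_price(1000, 0.03, 2), 2))  # prints 940.9

# 3. Innovate or shrink: with 3% deflation, a flat nominal paycheck
#    buys more next year, so output must rise ~3% to break even.
print(f"{1 / (1 - 0.03) - 1:.2%}")  # prints "3.09%"
```

The sign convention does the work here: the same formula that punishes a lender under inflation rewards one under deflation, which is the role reversal described in the first bullet.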

I’ve obviously oversimplified a fairly complicated macroeconomic situation in the comments above.  However, I’m hoping that the insights provided will be helpful to those of you who have trouble visualizing what deflation might look like, in practice.  If there is interest, I may put together another post on what types of investments perform best in a deflationary environment.