Zynga, Equity & Tough Decisions

A couple of days ago, a story broke in the Wall Street Journal about Zynga “leaning” on some early employees to surrender portions of their equity.  Not surprisingly, this blew up a bit in the press, leading to a wave of articles about potential threats to the Silicon Valley equity culture, employment litigation, and a number of other fairly serious issues.

As Zynga has indicated that their IPO is imminent, no doubt a lot of this is fueled by the fact that Zynga is a hot company right now.  But some of the issues raised are very real, and I thought it might be interesting to lend a different perspective to the story as an opportunity to think more deeply about the challenges leaders face in hypergrowth companies, even ones as successful as Zynga.

Executives are expensive

Marc Andreessen wrote a great blog post on some of the very real issues around hiring, managing and firing executives in hypergrowth technology start-ups.  It’s too long to capture everything here, but I do recommend reading it.  Marc calls it the “executive firing paradox”:

It takes time to gather data to evaluate an executive’s performance. You can’t evaluate an executive based on her own output, like a normal employee — you have to evaluate her based on the output of her organization. It takes time for her to build and manage her organization to generate output. Therefore, it takes longer to evaluate the performance of an executive than a normal employee.

But, an executive can cause far more damage than a normal employee. A normal employee doesn’t work out, fine, replace him. An executive doesn’t work out, it can — worst case — permanently cripple her function and sometimes the entire company. Therefore, it is far more important to fire a bad executive as fast as possible, versus a normal employee.

Now, the facts of the Zynga story are a bit blurry in the press, but for the purposes of this blog post, I’m assuming the following:

  • This issue affected a relatively small number of people at Zynga, specifically executive-level hires
  • These people were identified, over time, as underperformers at the original role they filled
  • These people still had not vested their equity

Obviously, the distinctions above matter greatly in terms of the tricky balance of issues around making a decision like this.

It’s worth noting, however, that executives are expensive hires.  If an executive is vesting 250K shares per year, and hiring a new engineer or designer costs 10K shares per year, then that person really has to deliver an incredible amount of value to justify their compensation.  After all, you could use that equity to hire 25 additional engineers.  A great leader can easily justify that value (and more) in terms of their power to create long-term value for the company, but it’s definitely a high bar to clear.

The Reason for Vesting

Not to be pedantic, but there is a very good reason why employees at tech companies are given equity.  Fundamentally, the best corporate cultures in Silicon Valley are based on people working together not just to build technology or products, but to actively build a great company.  Stock ownership is an important part of that culture – when people have meaningful equity in a company, it cements the idea that everyone is a part-owner of the business.

Four years may not seem like a long time, but in truth, hypergrowth tech companies grow and change at rates that seem theoretically impossible.  Zynga had 150 employees in 2008.  LinkedIn had fewer than 400.  As a result, the responsibilities and requirements of almost any position at the company change radically in a year, let alone four.  This is one of the great opportunities high tech companies afford: employees who take advantage of that growth can stretch quickly into new responsibilities and experiences.  But it’s extremely challenging and fairly unforgiving, because hypergrowth means that every person’s efforts potentially impact dozens of employees and millions of users going forward.

Vesting exists as an important reminder, however, that your share of the company is earned over time, not at signing.  You earn your share of the company – every day, every month, every year.  For most people, this isn’t an issue, because it is amazing how dedicated people are in Silicon Valley.  People are passionate about what they do and the teams they work with, and that passion translates into world-class dedication and effort.

Real Equity, Real Money, Really Tough Decisions

Back to Zynga.  Let’s assume, for a second, that you have the situation described in the Wall Street Journal.  You’ve identified a small number of relatively high level employees who, for whatever reason, you decide are underperforming their original roles.  Normally, there are a couple of options:

  1. Tolerate the underperformance, or compensate for it with additional hires, but let them “vest out” their stock grants despite the fact that they aren’t filling the role the equity was predicated on.
  2. Fire them.

As per Marc Andreessen’s post, option (1) is toxic.  The equity, while material, isn’t the dominant issue.  The impact to the company culture can be devastating, and if it becomes a repeated pattern, permanently damaging to the company’s ability to attract and retain the best talent and have them do their best work.

Let’s not forget also that we ask our company leaders to be mindful of their responsibilities to shareholders as well, particularly in public companies.  Executives are expensive hires, and equity allocated to them could always be allocated to hiring other great people.  Human beings tend to suffer from the sunk cost fallacy, and they hate to admit mistakes and take on difficult confrontations.  Option (1) swims in all of those issues.

But option (2) doesn’t always feel right in a hypergrowth company either.  What if the employee has a number of positive attributes and skills?  What if you would gladly hire them today, just in a different role?

From the press, it looks like Zynga tried to find a third way.  Rather than fire the employee, offer them the ability to stay at the company in a role that better suits their performance, with compensation to match.

You may not agree with that approach, and I think Semil Shah does a good job in TechCrunch talking about the cultural issues that this type of approach can cause.  But it would be foolish not to see that this is really a tough decision, and shouldn’t be trivialized or sensationalized.

Talking vs. Doing

There has never been a shortage of armchair quarterbacks and theorists debating the merits and demerits of different leadership actions and company cultures.  It’s part of an ecosystem that rewards thinking and learning.

It’s relatively simple to have a knee-jerk, emotional reaction to a piece like the one in the Wall Street Journal.  Let’s face it, that’s part of the reason they published it.  Companies like Zynga are amazing, and more importantly, they matter.  How they grow, navigate, succeed and fail is part of how we all learn to build better high tech companies.

It’s fairly easy, in fact, to demonize actions that you don’t agree with.  However, it’s often a much more productive intellectual path to ask yourself, “Why would good, smart, ethical people do this?”  Whether you agree or disagree with the actions taken by Zynga here, these are very hard decisions, and there is a lot for aspiring technology leaders to think about and learn from.

As Tom Hanks said in “A League of Their Own”:

If it wasn’t hard, everyone would do it. The hard is what makes it great.

The Synology DS1511+ RAID NAS & Time Machine on Mac OS X Lion

I recently suffered one of those storage network failures that you have nightmares about.  After spending more than $1000 on a NetGear ReadyNAS NV+, I had a catastrophic failure that cost me all of the data on the system.  Believe it or not, it was a single drive failure – exactly the type of problem you spend money on a RAID system to survive.  Unfortunately, in my case, it didn’t.

On the bright side, I had the opportunity to rethink and rebuild my storage and backup solutions from scratch.  In a recent blog post, I described my new network and storage topology.

Synology DS1511+ to the Rescue

The Synology DS1511+ is a great device.  It sits on your Gigabit network, handles up to five SATA hard drives, and can act as a wide variety of servers for your network.  I configured mine with five 3TB Western Digital Caviar Green drives, for 15TB of notional storage and 8.3TB of usable storage.

The Synology supports “dual drive redundancy”, so for the price of two drives’ worth of storage, you end up with protection for your data even if two drives fail simultaneously.  Needless to say, I went for that option.
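
For the curious, here is a rough sketch of where the “15TB notional / 8.3TB usable” numbers come from.  This is my own back-of-the-envelope arithmetic, not anything from Synology’s documentation, and the exact figure depends on rounding and filesystem overhead:

    # Back-of-the-envelope capacity math for a 5-bay array with dual-drive
    # redundancy.  Figures are approximate; DSM does its own rounding and
    # filesystem overhead trims a bit more.

    DRIVES = 5
    DRIVE_TB = 3        # 3 TB drives as marketed (decimal terabytes)
    REDUNDANCY = 2      # dual-drive redundancy gives up two drives' worth of capacity

    notional_tb = DRIVES * DRIVE_TB              # 15 TB of raw, notional storage
    data_tb = (DRIVES - REDUNDANCY) * DRIVE_TB   # 9 TB left for data

    # Drive makers count in decimal TB (10**12 bytes); operating systems usually
    # report binary TiB (2**40 bytes), which is where the ~8.3 TB figure comes from.
    usable_tib = data_tb * 10**12 / 2**40

    print(f"Notional: {notional_tb} TB, data: {data_tb} TB, usable: ~{usable_tib:.1f} TiB")
    # -> Notional: 15 TB, data: 9 TB, usable: ~8.2 TiB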

The industrial design of the box is well done.  You do have to break out the screwdriver to install the drives into trays (not quite as nice as the Drobo FS plug-and-play SATA drives), but the case itself is small, quiet and black.  It also has nice locks on each drive bay, which has made it “child proof” for my 2-year-old, who is unfortunately fascinated with the blinking lights.

The Synology box is incredibly fast.  First, it supports two Gigabit Ethernet ports, so it can establish connections to multiple clients independently.  But even from one machine, it’s wicked fast.  A simple Finder copy of a 500MB file to the drive takes under 6 seconds.  I was able to back up 2.7M files totaling 4.05TB in size using Time Machine (usually dog slow) in about 26 hours.
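
Translating those two data points into rough sustained throughput (again, just my own arithmetic on the figures above):

    # Rough effective throughput implied by the two measurements above.

    finder_mb = 500                           # 500 MB Finder copy
    finder_seconds = 6
    finder_rate = finder_mb / finder_seconds  # ~83 MB/s for a single large file

    tm_tb = 4.05                              # initial Time Machine backup size
    tm_hours = 26
    tm_rate = tm_tb * 1_000_000 / (tm_hours * 3600)  # ~43 MB/s sustained across 2.7M files

    print(f"Finder copy: ~{finder_rate:.0f} MB/s")
    print(f"Time Machine initial backup: ~{tm_rate:.0f} MB/s sustained")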

The Synology management software is Windows 2000-like in terms of its user interface and incredible breadth of options.  Needless to say, I only use about 1% of them.  I did run into one issue, however, and hence the title of this blog post: configuring the box for Time Machine on Mac OS X 10.7 Lion was non-trivial.

Time Machine on Mac OS X 10.7 Lion & Synology DSM 3.2

Time Machine, for better or worse, is the most consumer-friendly solution for incremental backup on the Mac.  Unfortunately, if you have multiple machines, you run into a small issue: Apple designed the software as if it “owns” the entire drive you point it at.  As a result, you can’t just point all your machines at a single network drive without a number of bad things happening.

Instead, you have to somehow convince Time Machine to only use part of the drive.  This turned out to be quite an issue for me, since I wanted to be able to back up my machine (~4TB) as well as my wife’s MacBook Pro (~500GB).

Synology has published documents on how to configure the box for Time Machine, and has designed its software around a clever option.  The basic idea is that you create a different “user” for each machine you want to back up with Time Machine.  For each user, you assign a limited quota, and then you tell Time Machine to use that user for the Synology volume.  It actually works quite well, although it feels a little strange to create separate user accounts for each machine, on top of accounts for each user.

The Undocumented 4TB Limit

Unfortunately, I ran into an undocumented issue.  When I tried to set the quota for my machine to 6000 GB (in general, you want to give 50% extra room for incremental changes / backups), Time Machine would only see about 1.8 TB.  When I checked the DSM 3.2 interface, I found that it had indeed reset 6000 GB to 1804 GB.  After trying to set it several times with the same result, I deduced that the maximum limit was 4096 GB, and that values were “wrapping” around that number.  Sure enough, entering 4100 became 4, and entering 4096 actually turned to 0, shutting off the quota entirely!

After some back and forth with Synology customer service, they finally admitted this was true.  (The first two times, they claimed that the issue was with Mac OS X 10.7 Time Machine not respecting quotas.)  I hope they fix the software to at least tell the user when they type a number over 4095 that they’ve exceeded the limit.
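
My working theory (and it is only a theory, not anything Synology confirmed) is that DSM stores the quota in a field that wraps modulo 4096 GB.  Here is a small sketch of that hypothesis; it matches the 4100 and 4096 cases exactly and lands close to what I saw for 6000 GB:

    # Hypothesis: the DSM 3.2 quota field wraps modulo 4096 GB.  This is my guess
    # from the observed behavior, not anything Synology has documented or confirmed.

    def apply_suspected_quota_wrap(requested_gb: int) -> int:
        """Return the quota DSM 3.2 appears to store for a requested value in GB."""
        return requested_gb % 4096

    for requested in (4100, 4096, 6000):
        stored = apply_suspected_quota_wrap(requested)
        note = "quota disabled entirely" if stored == 0 else f"{stored} GB"
        print(f"requested {requested} GB -> {note}")

    # requested 4100 GB -> 4 GB
    # requested 4096 GB -> quota disabled entirely
    # requested 6000 GB -> 1904 GB  (DSM showed me roughly 1.8 TB for this one)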

The Solution: Disk Groups, Volumes & Shares

To solve the problem, I reverted to a more old-fashioned solution: partitions.  Of course, with a sophisticated, modern RAID box, this was a bit more complex.  The Synology DSM 3.2 software supports three relevant concepts:

  • Disk Groups:  You can take any number of the drives and “bind” them together as a disk group.
  • Volumes:  You can allocate an independent “volume” of any size over a disk group.
  • Shares:  You can specify a share on a given volume which is available to only certain users.

The key here is that normally you use quotas to limit storage on shares for specific users.  But since I was looking for a 6 TB share, the 4096 GB quota limit ruled that out.  By default, shares get access to the entire volume they are on, so the trick was to repartition the box into separate volumes.

As a result, I configured my box as follows:

  • One disk group across all 5 disks, configured for dual drive redundancy using Synology Hybrid Raid (SHR)
  • Three volumes: one for my iMac’s Time Machine (6000 GB), one for my wife’s MacBook Pro (1000 GB), and one with the remainder for network storage (~1.3 TB)
  • For each volume, I configured a single share, without quota limits.  I gave my account access to my backup share, gave my wife access to her backup share, and gave everyone access to the general media share (the resulting layout is sketched below)
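
For what it’s worth, here is how I picture the resulting layout.  The volume and share names below are placeholders of my own, not what DSM actually calls them:

    # A sketch of how I think about the DSM layout: one disk group, three volumes
    # carved out of it, and one share per volume.  The names here are mine, not DSM's.

    layout = {
        "disk_group_1": {   # all five drives, SHR with two-disk redundancy
            "volume_1": {"size_gb": 6000, "share": "imac-timemachine"},
            "volume_2": {"size_gb": 1000, "share": "macbook-timemachine"},
            "volume_3": {"size_gb": 1300, "share": "media"},   # the ~1.3 TB remainder
        }
    }

    allocated_gb = sum(v["size_gb"] for v in layout["disk_group_1"].values())
    print(f"Allocated: {allocated_gb} GB of the ~8.3 TB usable")   # 8300 GB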

Works like a charm.  My iMac sees the 6TB volume for Time Machine, mounts it as needed, and backs up every hour.  Thanks to the incredible Synology speed, most incremental backups happen in the background in seconds without any noticeable performance lag.  In fact, the original backup of 4.05TB with Time Machine took about 26 hours.  On my NetGear ReadyNAS NV+, that same initial backup took almost a week.

Recommendation: Synology DS1511+

I have to say that, despite some back and forth over the Time Machine issue, the Synology website, wiki and documentation are all well done.  They are clearly responsive, even responding to my issues over Twitter.  Given the industrial design, features, and performance of the box, I have no trouble recommending the DS1511+ to anyone who’s looking for a large (10TB+) network attached storage solution for backing up a mixed network.

Disclosure: Synology was kind enough to provide me the DS1511+ free of charge given my difficult situation.

How to Extract Short Films from iTunes Extras

This is a quick tip, but somewhat delightful, so I’m sharing it here on this blog.  Credit to DJ Patil for goading me to write this up.

iTunes Extras

Recently, Apple debuted a new feature at the iTunes Store.  When you buy certain movies, typically the more expensive HD versions, you also get the “iTunes Extras”.  The iTunes Extras are basically “everything else” that comes packaged on Blu-Ray and DVD discs: deleted scenes, trailers, exposés on the making of the film, and for certain films (like Pixar movies), short films.

Free the Short Films!

There is a small problem with this system, however.  When you sync your iPod, iPhone or iPad with the library, you don’t get the iTunes Extras.  When you connect with the AppleTV, you don’t see the iTunes Extras.

More importantly, you don’t really want to carry around gigabytes of the extras.  I just don’t need to see “Making Of” clips that often.

Fortunately, it turns out to be an easy problem to solve.

Open the Package

Cracking open the iTunes Extras turns out to be trivial.  In fact, it’s not even cracking – it’s like finding the little red string on a wheel of cheese that makes it trivial to remove the wax covering.  Here are the steps:

  1. Go to the iTunes Extras file in iTunes, and “right click” or “control-click” the file.
  2. Select “Show in Finder” from the menu.
  3. You will now see the folder for the movie in your iTunes Library.  There will be a file selected with an “ITE” extension.
  4. “Right click” or “control-click” the file.
  5. Select “Show Package Contents” from the menu.
  6. You will see a folder inside called “videos”.  In that folder, you will see all the “M4V” files that are the video extras, including the short films.
  7. Just copy these files to your desktop.  I use “option-drag”: hold the option key down and drag the file to the desktop, which makes a copy of it there.  (If you’d rather script this step, see the sketch after this list.)
  8. Add the movie to iTunes, just like any other video.  You’ll have to add the artwork and fix the title, but then you have your short film, separate and syncable, just like any other movie.
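
If you’d rather do step 7 from a script than by dragging files around, here is a minimal Python sketch of the same copy.  The .ite path is a placeholder (point it at your own file), and it assumes the “videos” folder layout described above:

    #!/usr/bin/env python3
    # Copy the short films / extras out of an iTunes Extras package.
    # The .ite path below is a placeholder; point it at your own iTunes Extras
    # file.  Assumes the "videos" folder layout described in the steps above.

    import shutil
    from pathlib import Path

    ite_package = Path("/path/to/Your Movie.ite")   # placeholder path
    destination = Path.home() / "Desktop"

    for movie in sorted((ite_package / "videos").glob("*.m4v")):
        print(f"Copying {movie.name} ...")
        shutil.copy2(movie, destination / movie.name)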

You see, the Mac OS Finder has a trick that it inherited from NeXTStep: you can take any folder, mark it as a “package”, and the Finder will display it as if it were a single file.  In fact, all the applications on the Mac are delivered this way.  *.app files are really packages (directories) of content, wrapped so that you can click on them as if they were a single file.

The iTunes Extras file is just a package, and the video files are inside.  More importantly, they are all just “M4V” files – MPEG-4 video files that are copy-protected with the iTunes DRM.  So they largely work like the main video that you bought off iTunes.

It’s a little extra work to get the correct title, year and cover art on the file, but a quick cut & paste from Google can solve that.

Hope this delights at least one other person out there.  It certainly delighted me this weekend as I was able to free the “Toy Story: Hawaiian Vacation” short film from the new distribution of Cars 2 in HD on iTunes.

Build a Resilient Modern Home Storage & Backup Solution

I’ll admit it: my home network tends to push the edges of what consumer technology wants to support.  Two months ago, I had one of those terrible technology events that forces you to rethink your entire network: my Netgear ReadyNAS NV+ failed in a disastrous way, causing me to lose my entire iTunes Library.

As a result, I embarked on a process to rethink my offsite data backup and storage solutions for my household, which in this modern age of iPhones, iPads, AppleTVs, and countless media devices has become fairly complex.  Since the solution that I settled on required quite a bit of research, experimenting and simplification, I’m hoping some readers will find it interesting.

Call it: “Adam’s Home Storage Solution, Fall 2011 Edition”.

Overview: Network Design Diagram

You can see above the relevant elements of my home network topology.  It’s anchored to the internet via AT&T UVerse, which provides a 24Mbps down, 5Mbps up service over VDSL.  The router for my home network is plugged into an 8-port Gigabit switch, which is effectively the backbone for the entire house.  As part of the process of revisiting my network, I discovered that historically I had used a mish-mash of old Ethernet cables, some Cat 5, some Cat 5e, and it was affecting some connections.  A quick trip to Fry’s ensured that, for just a few dollars, I had Cat 6 cables for all Gigabit devices.  (This turned out to be important, particularly for connections to my iMac, wireless base station, and NAS box).

Basic Storage Topology

While my network supports a wide variety of clients, the backbone of my setup is very Apple-centric.  As a result, my solution is optimized around the following decisions:

  • My media store is based on iTunes
  • My primary server is an iMac running Mac OS X 10.7 (Lion)
  • My on-premises backup solution is Time Machine

I was able to simplify my storage needs for the network as follows:

  • The iMac uses the built-in 256 GB Solid State drive for the system & applications
  • The iMac uses the built-in 2 TB standard drive for local storage of most media (downloads, documents, pictures)
  • The iMac uses a 4 TB Seagate GoFlex External USB 3.0 drive for the iTunes library
  • The iMac and all other Macs in the house use Time Machine to back up to the Synology DS1511+, which has 8.3 TB of usable space.

The Synology DS1511+ has dual Gigabit Ethernet ports, which allows for particularly good performance when multiple machines are trying to read / write to it at the same time.  Configuring the box to support Time Machine for multiple clients is not obvious, but I’ll write up a separate blog post on that issue.

Overall, the performance of this solution is excellent.  iTunes performance from the Seagate is excellent, both for the primary machine as well as for remote devices utilizing Home Sharing to access media (like the AppleTVs).  We are able to run video off this solution to all 3 AppleTV devices simultaneously with no issues.  Copying a 250MB file to the Synology box takes approximately 2 seconds, and it offers no measurable delay in terms of Time Machine incremental backups, viewing, and restoration.  The entire initial backup of 4.05 TB via Time Machine to the Synology box took approximately 26 hours.

Backup Solutions

Let’s not forget that the impetus for this entire redesign was the tragic and unnecessary demise of the Netgear ReadyNAS NV+, causing massive data loss.  Without belaboring the point, I hope that no one who reads this will ever make the mistake of buying a Netgear ReadyNAS.

That being said, it did lead me to seriously rethink my approach and put a multi-tier solution for data protection in place.

I would have loved to go purely with a cloud-based solution, but the performance is just not there yet for multi-terabyte systems.  Not only does it take an inordinate amount of time to upload terabytes to the cloud, but in the case of data loss, recovering the data would be equally slow.  Uploading 400+ GB to the cloud took me approximately 40 days… 4 TB would have taken over a year!
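
To put rough numbers on that (my own arithmetic, using the figures above and the nominal UVerse upstream):

    # Rough math on why a cloud-only backup wasn't practical for me.  The figures
    # are approximate and assume the upload runs around the clock.

    uploaded_gb = 400
    days_taken = 40
    effective_mbps = uploaded_gb * 8000 / (days_taken * 86400)
    # ~0.9 Mbps sustained, well under the nominal 5 Mbps UVerse upstream once
    # protocol overhead and the machine not uploading 24/7 are factored in.

    full_backup_tb = 4
    days_for_full = (full_backup_tb * 1000 / uploaded_gb) * days_taken

    print(f"Effective upload rate: ~{effective_mbps:.1f} Mbps")
    print(f"At that rate, {full_backup_tb} TB would take ~{days_for_full:.0f} days")
    # -> roughly 400 days for 4 TB, i.e. over a year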

As a result, I triaged my content based on what I absolutely could not live without.  I settled on the 450 GB of photos and home movies that would be devastating to lose.  For $90, I subscribed to CrashPlan Pro, which offers unlimited storage and came highly recommended by everyone.

As a result, for this crucial data, I have 3 levels of protection:

  • Primary storage
  • Secondary backup via Time Machine to the Synology RAID, which can tolerate up to two simultaneous disk failures
  • Tertiary off site backup to CrashPlan

For the rest of my data, I have a fairly robust solution, but I’m considering periodically rotating a 4 TB drive offsite to add that “tertiary” level of security / safety.

Final Thoughts

The above solution may seem like overkill to some.  OK, probably to most.  However, you can simplify the solution above based on your needs.  For example, if you have only 200 GB of data to protect, maybe CrashPlan is the right “set and forget” solution for your network.  Maybe the 4 TB Seagate drive is sufficient for your Time Machine needs.

For those of you interested in the Synology box, I plan to write up a follow-on post on how to configure the Synology DS1511+ for Time Machine on Mac OS X 10.7 Lion.

Final Solution: Quicken 2007 & Mac OS X Lion

In July I wrote a blog post about a proposed solution for running Quicken 2007 with Mac OS X Lion (10.7).

Unfortunately, that solution didn’t actually work for me.  A few weeks ago, I made the leap to Lion, and experimented with a number of different ways to run Quicken 2007 successfully.  I finally came up with one that works incredibly well for me, so I thought I’d share it here for the small number of people out there who can’t imagine life without Quicken for Mac.  (BTW, if you read the comments on that first blog post, you’ll see I’m not alone.)

Failure: Snow Leopard on VMware Fusion 4.0

There are quite a few blog posts and discussion boards on the web that explain how to hack VMware Fusion to run Mac OS X 10.6 Snow Leopard.  Unfortunately, I found that none of them were stable over time.

While you can hack some of the configuration files within the virtual image package to “trick” the machine into loading Mac OS X 10.6, it ends up resetting almost every time you quit the virtual machine.  I was hoping that VMware Fusion 4.0 would remove this limitation, since Apple now allows virtualization of Mac OS X 10.7, but apparently they are still enforcing the ban on virtualizing Snow Leopard.  (Personally, I believe VMware should have made this check easy to disable, so that expert users could “take the licensing risk” while not offending Apple.  But I digress.)

You can virtualize Snow Leopard Server, but even a used copy on eBay still costs almost $200.00.  Add the $75.00 for VMware Fusion, and all of a sudden you have a very expensive solution.  Worse, VM performance is surprisingly bad for a Mac running on top of a Mac.  In the end, I gave up on this path.

Enter the Headless Mac Mini

For the longest time, you couldn’t actually run a Mac as a headless server.  By headless, I mean without a display.  It used to be that if you tried to boot a Mac without a display plugged in, it would stop in the middle of the boot process.

I’m happy to report that you can, in fact, now run a Mac Mini headless.

Here is what I did:

  • I commandeered a 2007-era Mac Mini from my grandmother.  (It’s not as bad as it sounds – I upgraded her to a new iMac in the process.)
  • I did a clean install of Mac OS X 10.6 Snow Leopard, and then applied all updates to get to a clean 10.6.8
  • I installed Quicken 2007, and applied the R2 & R3 updates
  • I configured the machine for file sharing and screen sharing, turned off the 802.11 wireless network and Bluetooth, and set it to wake from sleep via Ethernet.  I also configured it to auto-reboot if there is a power outage or crash.
  • I then plugged it into just power & Gigabit Ethernet, hiding it cleverly under my Apple AirPort Extreme Base Station.  It’s exactly the same size, so it now just looks like I have a fatter base station.

I call the machine “Quicken Mac”, and it lives on my network.  Anytime I want to run Quicken 2007, I just use screen sharing from Lion to connect to “Quicken-Mac.local”, and I’m up and running.  Once connected over screen sharing, I configured the display preferences of the Mac to 1650×1080, giving me a large window to run Quicken.

I keep my actual Quicken file on my Mac OS X Lion machine, so it’s backed up with Time Machine, etc.  Quicken Mac just mounts my document folder directly so it can access the file.

Quicken: End Game

This solution may seem like quite a bit of effort, but the truth is that after the initial setup, everything has worked without a hitch.  I’m hoping that once Intuit upgrades Quicken Essentials for the Mac to handle investments properly, I’ll be able to sell the Mac Mini on eBay, making it effectively a low-cost solution.

For the time being, this solution works.  Mac OS X 10.7 Lion & Quicken 2007.  It can be done.