Beyond Cool: Striped 120GB SSD RAID in a Macbook Pro

From time to time, I post the technical exploits of my friend Eric here.  I remember the attention he got a while back for hacking his MacBook Pro to support a RAID configuration.

Well, Eric has managed to extend that experimentation to a pair of new OCZ 120GB Solid State Drives (SSD).

Two OCZ Core Series v2 SATA II 120GB SSDs in a MacBook Pro

The blog post is here, with detailed photos and benchmarks.  A must-see for any digital photographer and/or Mac geek who is into performance-pushing custom expansion.

My favorite part of the walk-through is the brief commentary on the Apple-like packaging for the SSD drives:

The OCZ drives arrived in a plain package, but once the outer cardboard layer was removed, it was clear that OCZ had taken some packaging cues from Apple. The inner packaging was beautiful, and made it clear that you had just purchased a quality product.

That was the part I expected.  This is the part I didn’t:

Even though it was pretty, I don’t like excessive packaging and would have preferred something simple and biodegradable.

For some reason, I have a distinct mental image of Eric’s facial expression when saying this, and it made me laugh out loud.  🙂

The Latest Large Prime Discovered: 2^43,112,609 – 1

From Science News:

Here’s a number to savor: 2^43,112,609 – 1.

Its size is mind-boggling. With nearly 13 million digits, it makes the number of atoms in the known universe seem negligible, a mere 80 digits.

And its form is tidy and lovely: 2^n – 1.

But its true beauty is far grander: It is a prime number. Indeed, it is the largest prime number ever found.

The Great Internet Mersenne Prime Search, or GIMPS, a computing project that uses volunteers’ computers to hunt for primes, found the prime and just confirmed the discovery. It can now claim a $100,000 prize from the Electronic Frontier Foundation for being the first to find a prime number that has more than 10 million digits.

Don’t worry, prime hunters, there are still prizes to be claimed:

The Electronic Frontier Foundation became interested in prime hunting because it makes an excellent challenge problem for cooperative, distributed computing. “The award is an incentive to stretch the computational ability of the Internet,” says Landon Noll of Cisco Systems Inc., one of the judges for the Electronic Frontier Foundation prize and a discoverer of a former biggest known prime. More prizes remain to be claimed: a $150,000 award for a prime with 100 million digits, and a $250,000 award for one with a billion digits.

In case you are wondering why I’m posting this here on my blog, I do have some personal historical trivia that makes the issue of large primes sentimental for me.

The first job I ever had writing software was an unpaid high school internship at NASA Ames Research Center, here in Mountain View.  My project was to build a simulation model to evaluate error rates for different fluid dynamics algorithms.  In order to do the project, which was executed on a Cray X-MP supercomputer, I had to learn Fortran.

The sample project I chose to learn the language was a simple program that took the exponent of a Mersenne prime as input, and then generated the actual digits of the number in a large output file.
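
Out of curiosity, here is roughly what that exercise looks like today.  This is a minimal Python sketch of the idea (my reconstruction, not the original Fortran; the function and file names are mine):

```python
import math

def write_mersenne_digits(p, path="mersenne_digits.txt"):
    """Write the decimal digits of the Mersenne number 2**p - 1 to a file."""
    m = (1 << p) - 1          # 2**p - 1, computed with a single bit shift
    digits = str(m)           # decimal expansion; this is the slow step for huge p
    with open(path, "w") as f:
        f.write(digits)
    return len(digits)

# The digit count can be predicted without computing the number at all:
# floor(p * log10(2)) + 1.  For the new record prime, p = 43,112,609,
# that works out to 12,978,189 digits, the "nearly 13 million" above.
p = 43112609
print(math.floor(p * math.log10(2)) + 1)   # prints 12978189
```

On a modern machine the bit shift is instant; it is the conversion to decimal that does all the work.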

As a side note, this was also the first time I ever became familiar with the operating costs of these types of high-end systems… I remember being fairly shocked when the scientist I was working with explained to me that my program had taken several hours of Cray time, which was billed at about $2,000 per hour.

Of course, I’m fairly certain that my new 8-core Mac Pro is significantly faster than those old Cray supercomputers… 🙂

2009 Lincoln Cent Designs Unveiled

This past week, the US Mint published updated material on the new 2009 Lincoln Cent program, which will celebrate the 100th anniversary of the coin and the 200th anniversary of the birth of Abraham Lincoln.

From the US Mint website:

In 2009, the United States Mint will mint and issue four different one-cent coins in recognition of the bicentennial of President Abraham Lincoln’s birth and the 100th anniversary of the first issuance of the Lincoln cent. The reverse (tails) designs were unveiled September 22 at a ceremony held at the Lincoln Memorial on the National Mall in Washington, D.C. While the obverse (heads) will continue to bear the familiar likeness of President Lincoln currently on the one-cent coin, the reverse will reflect four different designs, each one representing a different aspect, or theme, of the life of President Lincoln.

The themes for the reverse designs represent the four major aspects of President Lincoln’s life, as outlined in Title III of Public Law 109-145, the Presidential $1 Coin Act of 2005:

  • Birth and early childhood in Kentucky (1809-1816)
  • Formative years in Indiana (1816-1830)
  • Professional life in Illinois (1830-1861)
  • Presidency in Washington, D.C. (1861-1865)

The new one-cent reverse designs will be issued at approximately three-month intervals throughout 2009. The Secretary of the Treasury approved the designs for the coins after consultation with the Abraham Lincoln Bicentennial Commission and the Commission of Fine Arts, and after review by the Citizens Coinage Advisory Committee.

For collectors, there will be a variety of coins.  You’ll likely see each of the four cents from both the Philadelphia and Denver mints (“P” and “D” mint marks).  It also looks like there will be true copper versions, with the same metal content as the original 1909 penny, from the San Francisco mint (“S” mint mark).  That’s 12 coins, at least.

For those who are interested, here are the four designs:

I don’t expect a lot of collector activity, largely because of the low nominal value of the coin.  Since there are always active movements to get rid of the penny, this might turn out to be the last hurrah for the one cent piece.

I wonder what the US Mint will charge for a roll of these pennies?

Stanford CS193P: iPhone Application Programming Launches Tomorrow

A little too busy tonight for a long blog post, but I thought I’d share how excited I am to be helping launch a new course at Stanford this Fall:

CS 193P: iPhone Application Programming

The class website is still a work in progress, but it will come along.  The course is open to Stanford undergrad and graduate students, as well as through the Stanford Center for Professional Development (SCPD) on video.  Enrollment is limited, and my guess is that it will be oversubscribed.

A wonderful opportunity for me to dust off the old Objective-C skills, and help give back to the Stanford community.  Launching new courses is always exciting, and I feel very lucky to be involved with this one in particular.

It might sound crazy to take this on in addition to the full load at both work and at home, but I’m excited to get back involved with teaching, and that’s worth the potential sleep deprivation for the quarter.

We Are Living History

Imagine my surprise.  Sunday afternoon, I got on a plane to Orlando, FL.  When I got on the plane, Lehman Brothers and Merrill Lynch still existed.  When I got off the plane, I checked my iPhone and saw that Merrill was now part of Bank of America, and Lehman was going Chapter 11.

Bear Stearns.  Lehman Brothers.  Merrill Lynch.  Fannie Mae.  Freddie Mac.  AIG.

Ongoing discussion now about Goldman Sachs and JP Morgan.  Fundamental problems right now with any business that basically borrows short and then lends long, at high leverage.

Really unbelievable.  Truly historic times.

I almost finished reading The Conscience of a Liberal, by Paul Krugman, on the plane trip back.  When I’m done, I’ll post a book review here.  Krugman is a smart economist, but he’s become rabidly political of late.  Still, there are a number of very interesting insights in the book.

One thing I definitely agree with is that the meltdown going on right now will be studied in history textbooks, the way that we studied the Roaring Twenties, the Great Depression, and the New Deal.  My guess is that the story will go something like this:

  • The End of the Cold War (1982-1992)
  • The Twin Bubbles (1993-2006)
  • The Great Crash (2007-2008/9)
  • The Way Forward (2009+)

It’s interesting to think about, since of course the history hasn’t been written yet.  And every day brings new surprises.

Alan Greenspan is Right on Fannie Mae & Freddie Mac

The incredibly historic economic news keeps coming this week.  Truly momentous.  It’s as if every article, every book, every course I’ve ever taken in modern economic history and theory was preparation for understanding the events of the past 12-24 months.

In some ways, I think I’m in shock.  It’s like watching history in the making.  History that will be the subject of textbooks for decades to come.  It’s really unbelievable.

After 70 years, we’ve come to the realization that yes, in fact, you cannot keep the benefits of a private company with public guarantees without paying the price at some point.

To rephrase: for decades, politicians from all parties have been in awe of the magic of Fannie Mae and its brethren.  Born out of the Great Depression, and spun off to raise funds for Johnson’s Great Society projects, it seemed too good to be true:

  • Private investors provide capital to add liquidity to the mortgage market
  • Home buyers get cheaper rates
  • Investors get “completely safe” securities that pay slightly more than Treasuries
  • The company generates huge profits on the spread between its borrowing rates and the mortgage rates it collects, net of defaults

Well, now we know that it was, in fact, “too good to be true”.

There is a lot to be concerned about in the Paulson plan.  It’s not at all clear why the sub-debt holders were left whole.  It’s not at all clear why the shareholders were left with 20% of the company.

Given the magnitude of the problem and the unpredictability of the large number of parties and variables involved, however, I’m willing to assume that Paulson didn’t optimize for the “best” deal, but for the most pragmatic and least risky in the near term.

(By the way, if you are looking for details, the New York Times pieces here and here have the best write-ups I’ve found to date.)

My biggest fear, at this point, is that the plan really defers the final solution to this problem until the next administration, when hopefully we’re through the worst of this.  It sounds pragmatic, but in reality, it makes it much more likely that by then the crisis will have passed, and Republicans and Democrats will retreat to their historic dysfunction on the topic.

I’ve read a multitude of proposed solutions, but in this case, I have to say, “Please listen to Alan Greenspan on this one.”

Yes, I know bashing Greenspan has become popular.  I’ll address that in another blog post – I had the opportunity to read his recent autobiography, and I thoroughly enjoyed it.

Try to ignore the bashing for now, and just focus on his recommendations for Fannie, Freddie, and their ilk.

Here is a WSJ story that summarizes his recommendations, made earlier this year:

His quarrel is with the approach the Bush administration sold to Congress. “They should have wiped out the shareholders, nationalized the institutions with legislation that they are to be reconstituted — with necessary taxpayer support to make them financially viable — as five or 10 individual privately held units,” which the government would eventually auction off to private investors, he said.

Instead, Congress granted Treasury Secretary Henry Paulson temporary authority to use an unlimited amount of taxpayer money to lend to or invest in the companies. In response to the Greenspan critique, Mr. Paulson’s spokeswoman, Michele Davis, said, “This legislation accomplished two important goals — providing confidence in the immediate term as these institutions play a critical role in weathering the housing correction, and putting in place a new regulator with all the authorities necessary to address systemic risk posed by the GSEs.”

But a similar critique has been raised by several other prominent observers. “If they are too big to fail, make them smaller,” former Nixon Treasury Secretary George Shultz said. Some say the Paulson approach, even if the government never spends a nickel, entrenches current management and offers shareholders the upside if the government’s reassurance allows the companies to weather the current storm. The Treasury hasn’t said what conditions it would impose if it offers Fannie and Freddie taxpayer money.

He’s right, and it’s not too late to move in this direction.

There is no reason for Fannie Mae & Freddie Mac to continue in their current forms.

The government should strictly regulate the requirements for securitizing and guaranteeing mortgages, the way it regulates commercial banking, deposits, and other types of financial business.  It can define specific types of mortgages, even give those types names to make it easier for consumers to comparison shop, and let the “Baby Fannies” compete to make markets in them.

By breaking them up and auctioning them off to the mega-banks, both domestic and international, the government guarantees a distributed system that is extremely tolerant of the failure of any one entity.  With properly structured regulation, this can become a stable, predictable, profitable business.

Gone is the government guarantee.  Gone is the lobby machine.  Gone is the too-big-to-fail entity.

I think Rep. Barney Frank (D, Massachusetts) is an example of why I’m afraid this won’t happen:

In the House, Mr. Frank, the chairman of the House Financial Services Committee, criticized the administration’s attempt to shrink the companies. He staunchly defended the companies’ ability to channel some of their profits from conventional mortgage financing to subsidize the construction of affordable rental housing and lower borrowing costs for low-income home buyers.

Mr. Frank seemed confident that he could stop the effort by the administration to ultimately shrink the companies through its rescue plan over the long term.

Catch that?  Instead of getting Congress to agree to fund affordable rental housing programs directly, which would have to be paid for through taxes or spending cuts elsewhere, Mr. Frank liked having an “off the books” slush fund to pay for these projects.  He’s still at it:

After repeated clashes with the White House over legislation that authorized the Treasury to bail out the companies, Mr. Frank succeeded in including a provision that required Fannie and Freddie to divert some of their profit from buying up “jumbo” mortgages for expensive houses into a fund for affordable rental housing.

Great.  After all, passing a law to force Fannie Mae to spend money on a program doesn’t cost the taxpayer anything, does it?

Well, it does.  Those strings came at a price.  The price was the implicit government guarantee.

And now we have a better idea of what that price really was.  And it’s not worth it.

How Apple Should Handle NBC

Just read this piece on the “he-said/she-said” debate between NBC executives and Apple executives:

Apple Refutes NBC’s Pricing Policy Claims

Apple is 80/20 right here, of course.  They did not give NBC any pricing privileges that they didn’t also give to all video content producers.  However, Apple did introduce its first pricing variation for HD with the AppleTV rentals earlier this year, and that’s more than it had last fall when NBC pulled its content.

I think, of course, that Apple is avoiding the most obvious solution to its problem:

  • Buy NBC off General Electric for a fair price
  • Fire at least the top 2-3 levels of executives at NBC
  • Set in place a modern digital content strategy
  • Execute non-exclusive, but solid digital content contracts with Apple
  • Take the new NBC public or sell it

As a side benefit, they could really have some fun with MSNBC.

What’s the point of Apple having a $135B market cap if they don’t use it?   My guess is that if executed properly, the above strategy could increase the value of both NBC and Apple.  Worst case, the upside on Apple is likely greater than the downside for NBC, making the “investment” worth it.

John Lilly, Mozilla Organization Talk at Stanford

John gave a great presentation today at Stanford about Mozilla.  He’s graciously shared it on Slideshare, so I’m sharing it here as well.

A few bullets to think about:

  • How distributed is the decision making in your organization, really?  How much do you empower small, cross-functional teams to execute?
  • How much does your organization really encourage active discussion, debate, and communication?  Does that discussion, debate and communication end within your company walls, or does it extend to your broader community?
  • How dependent is your organization on the “chain of command” vs. recognized experts and groups both within and outside your organization?
  • Does your organization understand the difference between inclusive discussion and democratic decision making?

In the final slides, there are a couple bullets I’m going to have to ask John about tomorrow:

  • Encourage transparency of decision making
  • Avoid democracy/consensus expectation setting
  • Lead, but don’t command

I’m not sure I fully understand the interplay between these in all cases.

The presentation is definitely worth reading if you are interested in Mozilla or distributed organizations.  It’s also worth reading if you want to be able to use the word “chaordic” in a sentence.

Welcome to World of Good

Seema may be a pretty miserable blogger, but she’s a great product manager.  And her site just went live last week.

Congratulations to the team, and welcome worldofgood.com.

World of Good is an attempt to produce the first global-scale marketplace for socially beneficial goods.  Yes, when you shop the site you will see badges for:

  • Eco-Friendly
  • People Positive
  • Animal Friendly
  • Supports a Cause

It’s a nice initiative because it captures some of the raw, positive economics of aggregating demand for these poorly distributed goods, allowing many of the vendors to reach buyers they otherwise would be unable to find.  It’s a classic eBay play: try to make an inefficient market more efficient.

I’m not sure of the overall business opportunity here for eBay, but it’s great to see this two-year effort pay off for Seema and the team.  Congratulations.

Goodbye, Bid-O-Matic

A few weeks ago, I wrote a eulogy to eBay Express here on this blog, and it rapidly became one of my most popular posts ever.  (Of course, nothing quite competes with the Battlestar Galactica posts, but I digress…)

Last week, eBay quietly announced the death of Bid Assistant, a product concept that I remember fondly from my days at eBay, and I thought it would be worth a few minutes to reflect back on lessons from the life span of that effort.  The truth is, while eBay gets a lot of press coverage from both the traditional media and from bloggers, I see very little, if any, actual detailed discussion of the features themselves, whether good, bad or ugly.  Usually, you just see factual reports, like this.

Bid-O-Matic, the original concept behind Bid Assistant, is an idea that goes back to at least 2005, if not earlier.  The problem it was attempting to solve is pretty much as old as auction bidding on eBay:

  • As a buyer, you often find several auctions for the item you are looking to buy, at various stages of completion.
  • If you bid on only one auction, the price of that auction might go too high, and by then you may have missed out on one of the other auctions.
  • If you bid on more than one auction, then you run the risk of winning more than one item.

eBay, of course, frowns on retracting bids, let alone backing out of a completed winning bid, so it’s a difficult situation to handle.  If you talked to any of the regular auction buyers on eBay, they would give you a personal story relating to this problem.  Try bidding on a digital camera some time, and you’ll feel the issue pretty quickly.

Enter Bid-O-Matic.

Bid-O-Matic was supposed to be the first step in building a true eBay assistant for bidding.  You, as a buyer, would pick out a list of equivalent items to bid on.  Bid-O-Matic would then place bids for you, attempting to win exactly one of the items at the lowest possible price.
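
To make the concept concrete, here is a hypothetical sketch of that core loop.  This is just my illustration of the idea; it is not eBay’s actual implementation, and all of the names are invented:

```python
from dataclasses import dataclass

@dataclass
class Auction:
    item_id: str
    current_price: float
    ended: bool = False
    won: bool = False
    leading: bool = False     # is our buyer currently the high bidder?

def bid_o_matic_step(auctions, max_price, place_bid):
    """One polling pass; returns True once the buyer has won an item."""
    if any(a.won for a in auctions):
        return True                       # goal met: exactly one item won
    if any(a.leading and not a.ended for a in auctions):
        return False                      # already high bidder somewhere; wait
    candidates = [a for a in auctions
                  if not a.ended and a.current_price < max_price]
    if not candidates:
        return False                      # nothing left under the buyer's ceiling
    cheapest = min(candidates, key=lambda a: a.current_price)
    place_bid(cheapest)                   # bid on the cheapest open auction only
    return False
```

The “one live bid at a time” constraint is the whole trick, since it is what keeps the buyer from winning duplicates.  It is also what makes the real version hard: proxy bidding, sniping, and overlapping end times all conspire against it.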

That was the idea, anyway.  Like many great product ideas, it had its roots in a real customer problem; a customer problem expressed in earnest by some of eBay’s best customers, its regular auction buyers.  And it was a classic case where technology could dramatically improve the customer experience.

And like many a road to hell, it was paved with good intentions.

Bid-O-Matic originally failed to get traction within the company, largely because the cost of building the feature did not seem to justify the incremental improvement to the eBay business.  The mathematical problem is that frequent auction buyers already buy a lot, so it was hard to see how this tool would really help them buy that much more.  In addition, the problem is unique enough to advanced users that it was hard to imagine many auction buyers who weren’t regulars adopting the tool.

Bid-O-Matic stayed just a concept until renewed focus on improving the auction experience took hold in 2006 as part of the “eBay 3.0” initiative.  Bid-O-Matic seemed like the perfect example of a feature that eBay’s best auction buyers would love, and so despite the numbers, the feature was given the green light.

Without going into too much gory detail, after much pain, schedule changes, cost increases, design compromises, and a typically horrific naming process, Bid Assistant was born.

While I was a huge fan of the initial concept, and of the people who worked on it, as a user I was never really able to engage with Bid Assistant.  It required a fairly arcane knowledge of “Watching”, the eBay process for bookmarking auctions.  The integration points were also fairly tortured – there was very little in the actual Finding and Buying experiences to lead you to discover the Bid Assistant.  Worse, I think fixed price listings severely limited the potential benefit of the feature.  Bid-O-Matic was never useful for multiple, unique, one-of-a-kind collectibles.  And if you are buying a commodity item, like a specific model of digital camera, then just buying it on eBay Express (or Shopping.com or Amazon.com) made much more sense.

Like all product professionals, I’m left torn by features like Bid-O-Matic.  On the one hand, I want to say that there was a real user problem here, and that with the right research, design inspiration, and iteration, eBay could have come up with a great product.  On the other hand, that time and effort is expensive, and there are likely much more important problems eBay could be putting it towards.

In any case, I just want to say goodbye to the Bid Assistant, and offer a brief acknowledgement to the team that built it.  Better to have tried and failed than to never have tried at all.

Have We Crossed the Uncanny Valley?

Just for reference, the “Uncanny Valley” is not some cute comment on life in Silicon Valley – it’s a popular concept in computer animation that refers to the challenges inherent in trying to produce “realistic” computer animation of human characters.  I wrote a blog post on the concept back in 2006:

Playstation 3, Uncanny Valley & Product Design

Uncanny Valley is a theory borrowed from robotics that says that when you have something relatively non-human like a puppy or a teddy bear, people will anthropomorphize it and like the “human-like” qualities of it.  However, if you make something too close to human, like a robot, people start to dislike it strongly as they focus on some key, missing detail.  Think about the uneasy feeling around corpses, zombies, or prosthetics.

Well, much to the potential delight of 30 Rock fans, we may be closer to crossing that valley than we thought.

Meet Emily.

[Image: Emily]

Emily is not real.  She is computer animated, leveraging new techniques for incorporating involuntary eye movement and other incredibly subtle cues from a real actress to generate a realistic effect.  She still comes across as a bit stiff, but not in an unnatural way.

Here is an explanation from the original article in the Times UK:

Researchers at a Californian company which makes computer-generated imagery for Hollywood films started with a video of an employee talking. They then broke the facial movements down into dozens of smaller movements, each of which was given a ‘control system’.

The team at Image Metrics – which produced the animation for the Grand Theft Auto computer game – then recreated the gestures, movement by movement, in a model. The aim was to overcome the traditional difficulties of animating a human face, for instance that the skin looks too shiny, or that the movements are too symmetrical.

“Ninety per cent of the work is convincing people that the eyes are real,” Mike Starkenburg, chief operating officer of Image Metrics, said.

If the historical pace of innovation in this area is any indication, we are likely less than three years away from seeing this type of technique utilized in a mass market short medium (commercial, animated short, small film segment) and within five years of seeing this used in a long medium (video game, television show, full length feature).

Amazing.

On a related note, this concept of more intensive real-life motion capture driving computer animation seems to be taking hold aggressively in commercial entertainment as well.  My son’s new favorite show is Sid the Science Kid, an innovation from the wizards at The Jim Henson Company.  It uses real-time motion capture from a live actor to generate a computer-animated special that can be produced in real time.  A fascinating blend of puppetry techniques and computer animation makes it possible, and the result is a computer-animated character who presents realistic faults and behavior on screen.  Here is the Muppet Wikia entry on the show.

It stands to reason that as more performances are captured, and computational storage and processing power increase, it will be relatively trivial to assemble a library of realistic behaviors and actions that will generate truly realistic, but completely artificial, performances.

An Obama Article During the Republican Convention

I generally don’t write about politics here on my blog, largely because I tend to be more issue-oriented than party-oriented, and that seems to bring out fire from both sides of the aisle.

Right now, I’m hopelessly behind on keeping up with the conventions – I’ve downloaded all the speeches from the DNC, but haven’t watched them yet.  Similarly, I haven’t yet watched a single minute from the RNC.

Strangely enough, however, I was forwarded a link to an article Marc Andreessen wrote about meeting Barack Obama in March 2008 that is worth reading:

An Hour and a Half with Barack Obama

Marc’s blog, by the way, is the blog that most closely resembles what I wish my blog could be.  Or should be.  Most of the articles are deep, interesting, sharp, and reflect frank advice and perspective that you don’t typically find in either professional news or popular blogs.  Worth subscribing to if you haven’t already.