Vanguard Cuts ETF Fees… Again

Vanguard announced this week yet another reduction in ETF fees on some of their major funds:

Earlier this month, Vanguard shaved its fees on four of its popular ETFs. Those were:

  • Growth ETF (AMEX: VUG), from 0.11% to 0.10%.
  • Value ETF (AMEX: VTV), from 0.11% to 0.10%.
  • Small-Cap Growth ETF (AMEX: VBK), from 0.12% to 0.11%.
  • Small-Cap Value ETF (AMEX: VBR), from 0.12% to 0.11%.

Also, the new Europe Pacific ETF (AMEX: VEA) wound up the year at 0.12%. The fund opened last July and was expected to assess expenses of around 0.15%.

“We originally estimated an annualized expense ratio at higher levels,” said Rebecca Cohen, a Vanguard spokesperson. “But after the year closed out, expenses wound up being less than originally estimated.”

While these are relatively tiny moves, the latest changes further distance Vanguard’s ETF lineup from the pack.  They also bring to 18 the number of ETFs on which Vanguard has cut expense ratios within the past four months.

The flurry of cost-cutting leaves Vanguard with an average expense ratio of 0.16%.  Through year-end 2007, Lipper data showed the average U.S. ETF with an expense ratio of 0.53%.

“As ETFs grow in size, they generally become more efficient to run,” said Vanguard in a statement.

As a shareholder-owned company, Vanguard says its “policy has always been to pass the savings from those efficiencies through to investors. The new expense ratios reflect the lower costs of managing these products.”

This is why I am such a loyal customer of Vanguard and Vanguard financial products.  Their entire brand promise is built around minimizing management costs for investors, and as a result, they proactively reduce rates constantly.  Unlike other institutions that use low fees as a short-term “loss leader” to bring in assets, Vanguard genuinely strives for the lowest cost structure, and passes those savings on to its investors.

The idea that you can now buy an index of small-cap, domestic growth companies for 11 basis points a year is just amazing.  11 basis points!  That means if you had $10,000 invested, the annual overhead cost would be just $11.  And that’s for a fairly focused index – I believe the broad-based US domestic stock index ETF from Vanguard is down to just 7 basis points!
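The basis-point arithmetic is easy to sketch – a quick illustration (the balance and ratios below are just the examples from above):

```python
def annual_expense(balance, expense_ratio_bps):
    """Yearly fund overhead: balance times the expense ratio, where
    1 basis point (bp) = 0.01% = 1/10,000."""
    return balance * expense_ratio_bps / 10_000

print(annual_expense(10_000, 11))  # 11.0 -> $11/year at 11 bps
print(annual_expense(10_000, 7))   # 7.0  -> $7/year at 7 bps
```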

When at all possible, I tend to go with the Vanguard index ETF/fund.  In fact, since many brokerages (like Fidelity) charge exorbitant commissions on the Vanguard mutual funds, you can now just buy the ETFs like any other stock.  Pay a cheap commission once, and pay cheap expenses for decades.

Hard to beat a great product with a great cost from a great firm.  Hard to beat.

To Know Elliot is to Love Elliot

We just got through an amazing launch last night at LinkedIn.  New homepage, new site-wide navigation, new Status feature, and countless other small enhancements.

This is a very funny blooper reel that Elliot & Mario recorded while trying to film the video to launch the new homepage.  It was so funny, they played it for the entire company at lunch yesterday.  Elliot has become a cult phenomenon.

I can’t tell you how proud I am to have Elliot on our team.  Enjoy.

Inflation, Hedonics, and How Silicon Valley May Have Wrecked Our Monetary Policy

I read a really interesting book on my trip to Boston last week.  It’s called Greenspan’s Bubbles: The Age of Ignorance at the Federal Reserve, by William Fleckenstein.  I’ve read Bill Fleckenstein’s columns on-and-off since 1999, when I found him through Herb Greenberg.  He’s definitely an intelligent guy, and while he comes across as a perma-bear, the reality is that he’s really just a very strong, traditional, bottom-up, fundamentals-based valuation guy.

He has a real axe to grind in this book, but I’m going to do a book review in a separate post.  However, one of the topics he raised was so interesting to me, I had to write a post about it.

Summary: I think we seriously messed up our monetary policy in the 1990s.

To be specific, I think that in the 1990s we made a fundamental change to the way we track inflation statistics for the United States – a change that on the surface seems logical.  But unfortunately, the economics of computers are so extreme that they may have completely distorted the inflation numbers for the entire country.  And if you distort the inflation numbers for the entire country, you run the risk of distorting the monetary policy of this country.  In fact, if you seriously mess up inflation calculations, you also affect fiscal policy, social benefit policy, and even global economic stability.

Yeah, it could be that big.

OK, here’s the information from the book that got me thinking.  It starts on Page 39, in the chapter called “The Bubble King”.  Fleckenstein explains three changes that were made in 1995 to the way the US calculates the Consumer Price Index (CPI):

  • Change 1: Move from Arithmetic to Geometric Rates.   OK, this one is perfectly legitimate.  After all, inflation rates compound year to year, so calculating the rate as a geometric progression is fundamentally correct.  I was actually shocked to find out we didn’t do this before, frankly.  True, at low percentages, arithmetic and geometric calculations don’t vary much, but they do vary, and geometric is absolutely the right way to calculate the number.

    For those of you asking what the difference is, let’s use this example.  Say over 5 years, the price of milk goes up 50%.  Arithmetic calculation would say 50%/5 = 10% per year.  The problem, of course, is that if you actually raise the price by 10% per year, you get a lot more than 50% because the price increases compound each year.  In Year 1, you’d go from $1.00 to $1.10, and in Year 2, you’d go to $1.21.  By Year 5, you’d be at $1.61, not $1.50.  It’s just like compounding interest in your savings account.   Geometric calculations take this into account.  Instead of 10%, they would say the inflation rate was 8.45%, which over 5 years compounds to 50%.

    Doing this lowers the number reported, but it’s fundamentally the correct number to report on an annual basis.  So far, so good.

  • Change 2: Asset Substitution.  This one is a little murkier.  Basically, the way that economists calculate inflation for consumer goods is that they take a representative sample of products – hundreds of them.  They then track the prices for these products each year.  If you’ve ever seen those funny articles that track the “price index of the 12 days of Christmas” every year, you get the idea.  🙂

    Asset substitution covers the case where people might substitute similar goods if one rises in price more than the other.  Inflation is lower for the person, because instead of buying the higher-priced item, they buy the lower-priced item.  For example, let’s say the basket of goods included a 12-ounce can of soda.  If the price of soda skyrocketed for some reason, most people would not actually spend the money, but would drink less soda and more water.  To the extent that such substitution happens, the inflation rate people actually experience is lower, because they don’t feel the full impact of the rise in the price of soda.

    Fleckenstein argues that this change was “truly absurd.”  Like a lot of the analysis in the book, that’s a significant exaggeration.  The truth is, the fundamental case for substitution is sound.  But like any of these economic techniques, if abused, it could lead to enormous errors in the calculation of inflation.

  • Change 3:  Hedonic Adjustments.  OK, this is the one that has me worried.  The CBO describes these as “quality adjustments”.  Once again, the logic behind them is sound.  It’s the execution that’s troubling.  Hedonic adjustments account for the fact that if you improve the quality and features of one of the items in the basket of goods, the price might rise due to that increase in feature set, not inflation.  For example, if in 2001 a Honda Civic has 145 horsepower, and in 2004 a Honda Civic has 160 horsepower, then the 2004 Honda Civic actually has 10% more horsepower than the 2001 version.  To the extent that people pay for horsepower, the inflation numbers are adjusted to reflect that part of the price increase in the Honda Civic is due to increase in function, not just inflation.

    Like asset substitution, this could easily be abused, since it involves a judgment call – how much has the product improved vs. how much has the price simply risen due to inflation?  It’s a hard line to draw, especially since in 2004 there are no new 145-horsepower Honda Civics around for an apples-to-apples comparison.
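The difference in Change 1 is easy to verify; here’s a quick sketch using the milk example from above:

```python
def arithmetic_rate(total_growth, years):
    # Naive annual rate: divide the total increase evenly across the years
    return total_growth / years

def geometric_rate(total_growth, years):
    # Compounded annual rate: the rate that, compounded yearly, yields the total
    return (1 + total_growth) ** (1 / years) - 1

# Milk up 50% over 5 years
print(round(arithmetic_rate(0.50, 5) * 100, 2))  # 10.0  (overstates the annual rate)
print(round(geometric_rate(0.50, 5) * 100, 2))   # 8.45
# Sanity check: compounding 8.45%/year for 5 years recovers the 50% total
print(round((1 + geometric_rate(0.50, 5)) ** 5 - 1, 4))  # 0.5
```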

So, now that you’ve gotten your fill of Macroeconomics for the day, here’s the part where we may have wrecked our monetary policy.

Moore’s Law

Well, it’s not just Moore’s Law.  It’s the pace of product improvement in the high tech industry, specifically hardware.  It’s huge.  It’s unbelievable.  There has never been another manufactured good like the computer, a product that doubles in capability every 18 months.  Hard drives double in size, too.  I bought a 40MB external hard drive in 1993 for $200.  I just bought a 1TB drive for the same price last month.  That’s a 25,000-fold increase in storage for the same price in 15 years.

Try feeding that through “Hedonic Adjustment” and see what you get.  A huge deflationary element.

Now, that wouldn’t matter, except for one thing:  computers have become a decently large chunk of the US economy.  Not huge, mind you.  The US economy is now over $13 trillion, and computers are lucky to make up 2-3% of that.  But 2-3% is actually a big number when you start feeding ridiculous improvements in “quality/features per dollar” through it.
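Here’s a back-of-the-envelope sketch of that effect.  The storage numbers and the 2-3% share come from above; the rest is purely illustrative, not actual BLS methodology:

```python
# 40MB in 1993 -> 1TB in 2008, at the same $200 price point
capability_multiple = 1_000_000 / 40      # 1TB = 1,000,000 MB, so a 25,000x jump
years = 15

# If capability rises while the price stays flat, the quality-adjusted
# price is effectively falling; annualize that decline geometrically.
annual_decline = 1 - (1 / capability_multiple) ** (1 / years)
print(round(annual_decline * 100, 1))     # 49.1 -> ~49%/year, quality-adjusted

# Crude drag on measured inflation if computers are ~2.5% of the basket
computer_weight = 0.025
print(round(-annual_decline * computer_weight * 100, 2))  # -1.23 percentage points
```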

Let me jump to page 101 of the book, in the chapter called “The Stock Bubble Bursts”:

James Grant, editor of the always insightful Grant’s Interest Rate Observer, was one skeptic who took the trouble to dissect the complicated subject that Greenspan seemed to accept at face value.  In the spring of 2000, Grant published a study by Robert J. Gordon, a Northwestern University economics professor, who had prepared for the Congressional Budget Office a paper with a shocking revelation:

There has been no productivity growth acceleration in the 99% of the economy located outside the sector which manufactures computer hardware… Indeed, far from exhibiting a productivity acceleration, the productivity slowdown in manufacturing has gotten worse: when computers are stripped out of the durable manufacturing sector, there has been a further productivity slowdown in durable manufacturing in 1995-99 as compared to 1972-95, and no acceleration at all in nondurable manufacturing.

Grant backed that thunderbolt up with another study conducted by two economists, James Medoff and Andrew Harless.  Their contention was that the use of a hedonic price index grossly misrepresented the actual data.

This is bad news.  Bad bad news.

In case you are wondering, the fundamental question that our Federal Reserve and other governmental agencies concerned with the US economy ask themselves is how much of the growth in the economy is due to three factors:

  • Population growth
  • Productivity growth
  • Inflation

If our calculation of inflation is off, it drastically changes our calculation for productivity.  Productivity is the measure of how much economic value is generated from one time-unit of work.  The 1990s were largely heralded as a decade of re-invigorated productivity growth.  It’s why some people think Robert Rubin (or Bill Clinton) was great.  It’s why people believed in a new economy driven by technological progress.
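A toy decomposition makes that sensitivity obvious.  All the figures here are hypothetical, and real decompositions use hours worked and capital, but the intuition holds:

```python
def implied_productivity(nominal_growth, inflation, labor_growth):
    """Crude residual: the real growth not explained by adding more workers."""
    return (nominal_growth - inflation) - labor_growth

nominal, labor = 0.055, 0.010   # 5.5% nominal GDP growth, 1% more workers

# Same economy, two different inflation estimates:
print(round(implied_productivity(nominal, 0.020, labor) * 100, 1))  # 2.5 -> a boom
print(round(implied_productivity(nominal, 0.030, labor) * 100, 1))  # 1.5 -> nothing special
```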

The data above is disturbing.  Yes, it suggests that high tech might have had a phenomenal impact on our aggregate numbers.  But those numbers are totally misleading if it turns out that 99% of the economy was not, in fact, seeing productivity growth.  Worse, it’s possible computers were actually masking continued weakness in every other area.

Look, I’m fairly sure that the people responsible for collecting this data are intelligent, and that this issue has likely been raised already.  It’s also possible that this book and its citations are already known and discredited.

Still, I’m left with the following thoughts:

  1. Is the above data true?  If so, does this mean the 1990s were not, in fact, a real productivity boom for the economy overall?
  2. If these issues are true and known, are the Federal Reserve, Treasury, Congress, et al. taking this into account when they make monetary and fiscal policy decisions?  If inflation is understated, then interest rate cuts, fiscal stimulus, and a whole host of other policy decisions could be disastrous.  We could end up with HUGE inflation in everything except computers to make the numbers balance.  (It reminds me of that line from “The Matrix Reloaded” – the system is desperately trying to balance the equation.)
  3. When they make hedonic adjustments for computers, do they take actual utility into account?  Sure, today’s Windows PC is 3x faster than one from five years ago, but the latest versions of Windows & Office are much more resource intensive than five years ago too.  My Mac Plus booted faster than my PowerMac G5.  How do they measure the hedonic adjustment for computers?  Are they grotesquely over-estimating the increased value from hardware improvements, without discounting the resource requirements of software to provide equivalent “utility”?

Feel free to comment if you have pointers to information either confirming or refuting the above issues.  This hits home for me as an issue that ties together two of my strongest personal interests – computers & economics.

Also, feel free to post this blog URL to other boards or forums where experts might be able to answer some of the above questions.

Karl Wiley Joins Motif as President of US Operations

Caught this on my Google News Alert today from PRLog:

Motif, Inc., a leading global knowledge-based BPO services provider announced today that Karl Wiley has joined the company as President of U.S. operations. Mr. Wiley will be responsible for all of Motif’s U.S. based operations, including corporate strategy, sales & marketing, key account management and M&A. He will be focused on driving accelerated growth for Motif by attracting new clients, expanding into additional industries and service lines, and growing activity from Motif’s current client base.

Mr. Wiley joins Motif after more than six years as an executive with eBay. Most recently he served as the Chief Operating Officer of MicroPlace, eBay’s start-up initiative providing a retail investment marketplace in the Microfinance industry. Prior to that, he was the general manager of eBay’s $5+ billion Technology and Media categories, and led eBay’s B2B wholesale initiative. In these roles, Mr. Wiley was responsible for strategy, consumer marketing, product management and customer service, and managed eBay’s relationship with many major branded retailers and manufacturers.

Karl was one of the great eBay Category Managers.  I first worked with Karl when he was part of the Business & Industrial team, which turned out to be an incredible pool of leadership talent.  At the time, Karl was the primary driver & business sponsor for product support for wholesale lots at eBay. For me, it was one of the first projects where I felt like I was truly working on features that were driven by the eBay selling community itself, and not from just internal motivation.  I learned a lot from my efforts with the B&I team, and even after the category management for wholesale lots was disbanded, I still ended up leading the course on Buying & Selling in Lots at eBay Live in 2004 & 2005.  Packed rooms, both times.

Congratulations, Karl, and best of luck with your new venture.

One minor quibble, of course: it’s time to update your LinkedIn profile.

Apple: Feature Requests for AppleTV and/or FrontRow

I’m not sure if anyone from Apple is reading this post, but hope springs eternal.

Listen, I love my AppleTV.  Every week I convert more and more of my movies to MP4 and add them to iTunes.  And I love the fact that FrontRow 2.0 in Mac OS X 10.5 (Leopard) is basically the AppleTV software.  Beautiful.  My Mac Mini, with a 500GB USB 2.0 drive, is an AppleTV on steroids.  Perfect.

I now have over 200 hours of movies and TV ripped for my two Apple boxes.

But there is a problem.  Two, really.  I need to request two key feature additions for the next dot-release of AppleTV and FrontRow:

Request 1: Please add video playlists

This should be obvious, because you already have them on the iPod, but I really need these on the AppleTV.  I have a lot of ripped TV shows and cartoons that are only 5-20 minutes in length.  What I want to do is create an arbitrary video playlist, and have the AppleTV play some number of shows continuously, in a row.  Right now, the device has no playlists.  So either I have to rip a custom MP4 of each combination, or I have to manually play each show individually.

Example: School House Rock

I ripped this DVD.  37 Episodes, each 3 minutes.  Of course, I could not possibly rip the individual shows given the current AppleTV interface.  Instead, I ripped a full two-hour block of all 37 episodes back-to-back.  I then ripped smaller, 8 or 9 episode sequences based on topic.

What I should have been able to do is:

  1. Rip each 3 minute short as a separate file
  2. Create playlists of different groups and sequences of those segments

A lot of great video content is short, and makes sense to view in playlists.  Ripping different combinations into a single video file is wasteful, and clutters the interface.

Request 2: Folders

This one is easy because Tivo already figured this out with version 4.0 of their interface (they are now on 9.x I think).

Let me create folders to group together content so that I don’t have a linear list that goes on forever.  For example, if I have all 6 Star Wars movies, let me create a folder for them.  If I have 10 hours of Band of Brothers, let me group it together.

Right now, the only “grouping” functionality is through the TV Shows interface.  Frankly, that’s pretty clunky.  I’m not even sure I like breaking video apart into Movies and TV Shows – I certainly don’t have that breakdown on my Tivo.  I’d rather just see TV Shows as a folder of video, sorted by season, then episode, much like music broken down by artist and then album.

True, I wouldn’t mind dynamic grouping based on tagged elements of the movies, but that’s actually overkill for now.  I’d settle for good, old-fashioned, manual folders.  A simple directory structure could help the current interface and remote scale to hundreds of movies, instead of dozens.

So, if you are listening Apple, help me out here.  Thanks.

The Limits of Quantum Computers by Scott Aaronson

I had a business trip to Boston this past week, which means I got a lot of good reading hours in on the plane ride across the country.  As a result, expect to see some intellectually inspired posts this week.

Tonight, I’m going to start off with an easy one – the most recent issue of Scientific American.  It is a great issue.

Actually, three of the articles were blog worthy.  Tonight, I’m going to highlight the great piece by Scott Aaronson called “The Limits of Quantum Computers”.

Here is a synopsis, from the top of the article:

  • Quantum computers would exploit the strange rules of quantum mechanics to process information in ways that are impossible on a standard computer.
  • They would solve certain specific problems, such as factoring integers, dramatically faster than we know how to solve them with today’s computers, but analysis suggests that for most problems quantum computers would surpass conventional ones only slightly.
  • Exotic alterations to the known laws of physics would allow construction of computers that could solve large classes of hard problems efficiently. But those alterations seem implausible. In the real world, perhaps the impossibility of efficiently solving these problems should be taken as a basic physical principle.

Nah, I don’t think that does it justice.

I’ve been following Quantum Computing off-and-on since the mid-1990s.  I took my first Automata & Complexity course at Stanford (CS 154, from Rajeev Motwani) back in 1995.  It was one of the truly mind-opening courses in the Computer Science undergraduate program.  Recognizing that there are mathematical frameworks not just to solve problems, but to describe their complexity, is fascinating.

Quantum Computing is fascinating because it takes advantage of the truly strange physics of entanglement, a state in Quantum Mechanics where particles can share a matching, but unknown, fate.  A separate branch of algorithmic mathematics has sprung up around analyzing what types of problems, if any, would be simpler to solve on a computer that leveraged these quantum bits, or “qubits” for short.  At the same time, molecular scientists have struggled to make progress building even very small quantum computers.

To date, there is a small number of problems that quantum computers have been proven to solve significantly more efficiently than traditional computers.  Interestingly, most of them revolve around factoring, which happens to be the basis of most of our security algorithms.  It turns out that factoring a very large number into two primes is very difficult for normal computers, but very easy for quantum computers.
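To make the factoring gap concrete, here’s a minimal sketch of classical trial division (the toy semiprime is my own; real RSA moduli are hundreds of digits long, where this approach becomes hopeless, while Shor’s quantum algorithm runs in polynomial time):

```python
def factor_semiprime(n):
    """Classical trial division: find primes p, q with p * q == n.
    Work grows with sqrt(n), i.e. exponentially in the number of digits."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 1
    return None  # n itself is prime

# A toy RSA-style modulus: 43 * 47
print(factor_semiprime(2021))  # (43, 47)
```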

I don’t think I can summarize an 8-page detailed article here, but let’s just say that in this short article, Aaronson manages to:

  • Give a high level overview of basic complexity theory
  • Give a background on what Quantum Computing is, generally
  • Give a background on what makes Quantum Computing different, algorithmically
  • Give examples of the types of problems that QC will significantly improve
  • Give examples of the types of problems that QC will not significantly improve
  • Give interesting mathematical & physics implications of QC algorithmic theory
  • Intersperse the above with incredibly useful diagrams and drawings

Here is my favorite chart in the article – a simple one that maps the changes that quantum computing introduces in the world of algorithmic complexity:

And that’s just one of the sidebars!  🙂  It’s interesting to note that, after scanning this, I discovered from Scott’s blog that he had to fight to get that diagram included!

The complexity class inclusion diagram on page 67 was a key concession I did win. (Apparently some editors felt a Venn diagram with P, NP, BQP, and PSPACE would be way too complicated, even for readers who regularly gobble down heaping helpings of M-theory.) As you can imagine, exposing people to this stuff seemed pretty important to me: this is apparently the first time P, NP, and NP-completeness have been explained at any length in Scientific American since articles by Knuth and by Lewis and Papadimitriou in the 1970’s.

Much appreciated, Scott.

Scott Aaronson has his own blog:

and he also runs an online encyclopedia for complexity classes:

And to think, I was just at MIT and missed the chance to meet him. 🙂

The article is not yet fully online, but if you have a chance, I highly recommend picking up a copy of the issue.  Scott has posted an early draft of his article, in PDF, here.  Or better yet, subscribe.  It really is the one scientific magazine to subscribe to if you want to keep up-to-date on a broad range of scientific discovery.

Get LinkedIn on Your iPhone, Now!

It’s live, it’s live!  After weeks in beta, it’s LIVE!

This is just the first release, but already you can:

  • Search LinkedIn from your iPhone
  • View all your contacts and their full profiles, from your iPhone
  • Invite new people you meet, from your iPhone
  • Browse your network updates, from your iPhone

What are you waiting for?  You should immediately:

  1. Use your iPhone to go to:
  2. Hit the (+) button in the middle of the bottom control bar in iPhone Safari.  This adds LinkedIn as one of your default web clippings on your iPhone desktop.  An absolutely gorgeous “IN” logo will grace your iPhone.

For those of you who don’t have an iPhone, this is actually the same URL that serves our general mobile application.  Of course, if you don’t have an iPhone yet, you might want to just stick with step (1) above.

I installed the beta of this application on my wife’s iPhone, and I play with it incessantly when we’re on the road.  It’s completely addictive.

Brought to you, with love, from a major web company that develops its site exclusively on Mac OS X.

The LinkedIn Wizard is Out!

It’s Thursday night, and in Mountain View right now, great new features and enhancements are rolling out, as usual, to the LinkedIn website.

There is much lore about the origins and purpose of the LinkedIn wizard, but you only see him on Thursday nights, when we’re doing a release that requires downtime.

Right now, I’m in Boston, so I’m missing the release and the dinner and the Rock Band sessions that go with it.  But fortunately, the Wizard is there, to help me feel a little closer to home.

The Subprime Primer in Stick Figures

Two email forwards tonight as posts.  My apologies.

This one will appeal to you finance fanboys out there.

The Subprime Primer

This is a 2.4MB PowerPoint presentation that walks through the basics of the Subprime crisis.  It’s extremely funny, if you are into stick figures that use foul language.  It definitely wins the award for best use of a Norwegian stick figure swearing in a PowerPoint document.  (I will consider others for the award, if you post links.)

Yes, please don’t download it if you are offended by any of the seven words banned by the FCC on radio.  And yes, if you watch Deadwood on HBO, you will be more than OK with this deck.

The Worst Appetizer: The Chili’s Awesome Blossom

My wife forwarded a really fun article:

Divine Caroline: The Worst Artery Cloggers in America

It’s a sick type of voyeurism, I know.  Dietary rubbernecking.  You look at the nutritional stats and you go, “How could anyone eat one of these things?!?”

Well, in all fairness, I do believe that in my younger days I actually ate one of these: The Chili’s Awesome Blossom.  Check it out:

Worst Appetizer: Chili’s Awesome Blossom
2,710 calories, 203 grams fat

Chili’s is all sorts of wrong. The one and only time I ate there, I almost dove over the table and made the waitress give me my money back, it was so bad. It’s clear now I was just in a salt and sugar-induced rage. Even with their crappy food laden with sodium, fat, and emulsifiers, you’ve got to admire them. How they turned a simple onion into a day’s worth of calories and three days worth of fat is a miracle of food science.

Come on, that’s fairly amazing.  It’s not easy to hide 2,710 calories in an appetizer.  Really, it isn’t.  Sure, you could serve someone a bowl of 203g of lard, but would they really eat it?  (That’s 40.6 teaspoons of Crisco, for those of you counting out there.)  I think it’s a fairly impressive achievement.  Given that it has been on the menu for over 10 years, it must be fairly successful.

I was somewhat surprised to see a Chipotle item on the list.  Yes, I know anything called a Super Burrito is unlikely to appear in the Weight Watchers guide to Tex-Mex, but this was a little scathing:

Worst Meal in Tin Foil: Chipotle Barbacoa Super Burrito
1493 calories, 68 grams fat (22.5 saturated), 3,644mg sodium, 151 grams carbohydrates, 144 mg cholesterol

The other day my coworkers and I went to Chipotle. It was all shits and giggles while we were eating—“hey, pass the guac!” “want some more chips?!”—but then about an hour later, things went terribly wrong. Our stomachs hurt. Our mouths puckered. I drank about a gallon of water but couldn’t seem to quench my thirst. The bathroom stunk.

I think it’s ironic that Chipotle sponsors a biking team. No athlete would ever eat there.

You should check out the full article.  Lots of fun.

Psychohistory Reaches 450 Posts

Just a quick milestone for this blog.  It’s been a couple of months since I’ve posted any stats.  I don’t have the patience tonight to go into extreme detail, but here are some high level numbers for Psychohistory to date:

Blog Stats

  • Total Views: 272,308
  • Best Day Ever: 4,536 — Monday, March 26, 2007
  • Posts: 450
  • Comments: 1,312
  • Categories: 35

Here is a nice graph that shows page views, month by month, since I launched this new blog in August 2006:

The big spike in March 2007 was due to a flurry of interest in my Battlestar Galactica posts, which remain to this day the most popular posts on this blog.

2008 Economic Stimulus Tax Rebate Calculator

Found this post on Lifehacker today.  It’s actually just a pointer to this calculator on Consumerism Commentary, which lets you enter your 1040 numbers from 2007 (if you’ve done them yet) and figure out how much you are (or aren’t) getting.

Economic Stimulus Tax Rebate Calculator

Personally, I’m exceptionally disappointed with the “bi-partisan” stimulus plan that was negotiated by the White House & Congress.  There is a time and a place for fiscal stimulus, and a time and a place for social programs.  But mixing the two rarely leads to good policy.

The Wall Street Journal had an article today that estimated that roughly 50-70% of the rebate money would end up in consumption.  Previous blog posts have argued that the 2001 stimulus rebate, which was similar, was mostly ineffective.

Anyway, what’s done is done.

Blu-Ray vs. AppleTV HD vs. Comcast HD vs. DVD

This is absolutely the best user-based review of the various high-definition digital movie formats out there.

Apple TV 2.0 vs. Blu-Ray, DVD & HD Cable: The Comparison

Very fair, very balanced.

More importantly, this is the first review I’ve seen that doesn’t focus on the technical specifications and argue theoretically about which format should be better.  Instead, this article provides actual example frames and comparisons from a viewer’s perspective in a very realistic setting.

Gorgeous screen captures and zoomed/cropped images highlight the text along the way.

Definitely a must-read.  But, in case you find it too long, here are my takeaways:

  • Blu-Ray wins, but only matters if you actually have a 1080P set.
  • AppleTV HD is surprisingly good, despite being 720p.
  • Comcast HD seems to be the loser here, although color is better.
  • All 3 are noticeably better than upconverted DVD (once again, on a 1080P set).

The article ends up arguing that Netflix may have the best model here… but I’m not 100% sure.  I’m in a movie-buying freeze right now as I debate what form my HD library will eventually take.  It’s hard for me to consider a format that I can’t use freely on multiple devices (Blu-Ray), but it’s also hard for me to consider a rental option where I don’t have the movie at my disposal.  A Blu-Ray disc that came with a DRM-protected iTunes MP4 would work for me, since I’d happily pay the $20 per disc to have a physical copy and the freedom to use the file on any device.

Too bad no one is providing that… yet.

Mozilla Firefox 3 Beta 3: First Impressions

Got to be careful what I say here.  Mike Schroepfer might be reading. 🙂

Actually, I was reading his blog when I found out that Firefox 3 Beta 3 is out.  You can download it here.

I’m playing the naive user for now… just installed it and using it, without reading up on the specifics of the new features.  I’m trying to see what I actually notice without any prep.

First thing… it’s FAST.  Much faster than Firefox 2.  And much much more stable with lots of tabs left open, although I need to give this a bit of a test through the weekend.

One of my biggest problems with Firefox 2 has been based on my particular usage habits.  I tend to open a lot of web pages in tabs, and leave them open for days (or weeks), as reminders to either read the pages or blog about them (or both).  What I’ve noticed is that once I get a large number of open tabs (20+), Firefox starts lagging my entire machine.  I don’t have the fastest machine in the world (PowerMac G5, Dual 2.5Ghz, 2.5GB RAM), but I’m pretty sure it should be able to display 20+ webpages at one time. 🙂

Anyway, everything is faster with Firefox 3.  My eBay loads faster.  SYI 3 loads faster. loads faster.  Email links that open URLs in Firefox open faster.  And when I launch with a dozen or more tabs, it feels much more stable, not locking up nearly the way that Firefox 2 did.

I’m noticing on Mac OS X (10.5) that the controls look a little goofy.  The small controls used on eBay now come out as Mac-like round buttons, but the font is off-center.  Also, the drop-down menus actually have their text one pixel below the end of the menu control.

This is stuff I’m sure that’ll get fixed by final release.

We’re obviously going to have to get busy updating our LinkedIn toolbar – Firefox 3 informed me the current version isn’t compatible.  I use that toolbar every day, so I’m going to have to make sure that gets fixed.  🙂  In fact, none of my toolbars were verified to work with Firefox 3, which is probably a good thing since I don’t use most of them anymore anyway.

I’ve been very happy with Firefox vs. Safari since I switched about two years ago.  I was debating whether Safari 3 and the rise of the iPhone meant I would eventually have to switch back to Safari as my primary browser.

It’s not final, but my first few hours on Firefox 3 have left me fairly confident that Mozilla will continue to be my browser provider of choice for the foreseeable future.

You know, I just realized that Mozilla’s success in making a great web browser for the Mac gives the lie to Microsoft’s excuses for abandoning the platform.  Firefox proves:

  1. That a great web browser can be built as a stand-alone application, not as a component of the OS.
  2. That a great web browser can be built on the Mac by a company other than Apple.

These were, of course, the two nominal reasons that Microsoft gave back in 2002 for dropping Internet Explorer on the Mac.

As Apple market share continues to grow, and the concept of an all-Windows workplace fades, I have to wonder – will Microsoft ever reconsider providing IE as a cross-platform browser again?   Even if the Mac has a low (5%) market share, that doesn’t mean only 5% of companies will have Macs deployed.  It could turn out that a vast majority of companies end up with a minority share of Macs in-house.  Does Microsoft really want to cede the cross-platform web application market to Mozilla?

Somehow, I doubt this is being seriously considered in Redmond.  But it’s definitely interesting in the face of a resurgent Mac platform and a cross-platform Firefox & Safari.  Internet Explorer for the iPhone, anyone?