Posted by Brian
The Innovator’s Dilemma: When New Technologies Cause Great Firms to Fail
By Clayton Christensen
How do successful companies fail? Often the answer is obvious: poor management and economic downturns are two common culprits. More interesting are the companies that seem to do everything right, but a few years later are in a steep decline. How is it possible that management which only a few short years ago was being lauded as a model for the industry can come to be regarded as blathering idiots with no clue as to what their customers want? In The Innovator’s Dilemma Clayton Christensen argues that this is caused by the introduction of disruptive technology into the market.
What is disruptive technology though? Christensen defines it as a technology that performs worse, at least in the near term, on what has been considered the key market measurement, while still performing well enough to be acceptable. It trades off maximum performance on this key measurement for features that customers outside of what has until now been the core market care about more. He uses the disk drive industry as his major example throughout the book. Each time a smaller disk drive was developed, the established players ignored it because its storage capacity wasn’t interesting to their current customers. However, its smaller size was interesting to a new market, and because technological innovation often moves faster than market demand, the disruptive technology was eventually able to displace the existing technology, along with the companies pushing it. He traces a similar process with hydraulics in excavation, minimills in steel production, and discount retailers.
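Christensen’s trajectory argument can be sketched numerically. The growth rates and starting points below are invented for illustration, not figures from the book; the point is only the shape of the dynamic: a technology that improves faster than market demand grows will eventually be "good enough" even if it starts far behind.

```python
# Toy model of the disruptive-technology trajectory (all numbers are
# invented for illustration): capability and demand both grow, but
# capability grows faster, so the entrant catches up.

def first_year_meeting_demand(start, growth, demand_start, demand_growth,
                              horizon=30):
    """Return the first year the technology's capability meets market demand."""
    capability, demand = start, demand_start
    for year in range(horizon):
        if capability >= demand:
            return year
        capability *= 1 + growth
        demand *= 1 + demand_growth
    return None

# Established drives: already exceed mainstream demand in year 0.
print(first_year_meeting_demand(100, 0.40, 80, 0.10))  # 0
# Disruptive smaller drives: start far behind on capacity, but improve
# faster than demand grows, so they satisfy the mainstream within years.
print(first_year_meeting_demand(20, 0.40, 80, 0.10))   # 6
```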
I was pleased to learn that The Innovator’s Dilemma is often used in MBA programs now, although I do wonder how it is received by both faculty and students. The idea that good management can be a direct cause of failure is probably a non-intuitive and disquieting thought to many of them. Christensen is an engaging writer with the data to back up his theory. Highly recommended.
Posted by Brian
The Mythical Man-Month
By Fred Brooks
The Mythical Man-Month is the most influential software project management book ever written, and I am not conceited enough to think that I will have any original thoughts on its contents, but here goes.
The most widely known item to come out of this work is Brooks’ Law, which states that adding manpower to a late software project makes it later. This is a very counter-intuitive statement to the non-programmer, but it makes perfect sense to those who practice the craft. Every project is different, and the time it takes new team members to get familiar with the project pushes the rest of the project even further behind schedule. This of course causes management to add even more manpower, and the project enters a death spiral.
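The dynamic is easy to see in a back-of-the-envelope simulation. The numbers below are my own toy assumptions, not Brooks’ figures: new hires produce nothing while ramping up, veterans lose time mentoring them, and communication overhead grows with the number of pairs on the team.

```python
# Toy simulation of Brooks' Law (all rates are invented assumptions):
# on a late project with little work left, the ramp-up and coordination
# costs of new hires outweigh their eventual output.

def months_to_finish(work, veterans, new_hires=0, ramp_months=3,
                     mentor_cost=0.5, comm_overhead=0.05):
    """Simulate month by month; return months until `work` person-months are done."""
    month, done = 0, 0.0
    while done < work:
        team = veterans + new_hires
        # New hires contribute nothing until they have ramped up.
        ramped = new_hires if month >= ramp_months else 0
        producing = veterans + ramped
        # Veterans spend part of each month mentoring un-ramped hires.
        mentoring = mentor_cost * (new_hires - ramped)
        # Overhead rises with the number of communication channels.
        channels = team * (team - 1) / 2
        done += max(producing - mentoring - comm_overhead * channels, 0)
        month += 1
    return month

print(months_to_finish(20, veterans=5))               # 5: leave the team alone
print(months_to_finish(20, veterans=5, new_hires=5))  # 6: "help" makes it later
```

With these assumptions the reinforced team actually finishes a month later, which is the whole point of the law.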
He also cites the oft-mentioned 10:1 programmer productivity ratio, which states that an excellent programmer can be ten times as productive as a mediocre one. It comes from a study by Sackman, Erikson, and Grant.
While the core is still strong, around the edges the book is showing its age. Whenever Brooks talks about specific technologies, programming languages, or methodologies everything feels dated. At the time of writing it was still a big debate whether languages like FORTRAN were worth using over assembler. He also pushed what I felt was a heavy-handed organizational concept that he called a “surgical” team. The team involved a large number of support people around a central architect who made all the final decisions. A number of these people wouldn’t be needed today because of technological advances, but I still wasn’t sold on the idea.
Overall, this is of course a must read for any programmer or manager of programmers. It is important to remember that Brooks is talking about management in the context of very large projects though. My work comes nowhere near the scope he is used to, and the ideas can seem heavy-handed because of this. With that being said, any book with this much staying power has to have something to it. While not everything here is gold, a large chunk still applies.
Posted by Brian
Open Sources 2.0: The Continuing Evolution
Much has changed in the world of open source since the publication of the original Open Sources in 1999. Open Sources 2.0 is another collection of essays that discuss many of these changes, touching on a wide array of topics in open source software and in the application of open source principles in other areas. As with any collection of essays there is no coherent narrative, so I will discuss several essays that I enjoyed. This review would be very long if I discussed all of them as only a few were uninteresting to me.
Jeremy Allison, a lead developer of Samba, contributed an article comparing the POSIX and Win32 standards. POSIX is a series of related standards that defines the interface for a UNIX operating system, while Win32 is Microsoft’s de facto standard API for Windows programming. Allison’s Samba experience makes him uniquely qualified to compare the two. He gives a balanced take, looking at one bad and one good feature of each. For POSIX he looks at the horribly broken locking calls (AT&T submitted a terrible proposal in the standardization process and nobody else cared to complain) and its impressive record of future proofing. For Win32 he gives kudos for shipping excellent Unicode support before any other vendor, and a big thumbs down for a powerful but overly complicated security model. It’s actually possible to write very secure code with Win32, but the security model is so difficult to use that nobody really wants to. He concludes by advising you to stay away from de facto standards.
Andrew Hessel contributed an essay focused on applying open source principles to biotechnology. He describes how current IP practices have contributed to rising drug development costs and waves of mergers and acquisitions as firms try to consolidate fragmented IP. He makes an interesting observation that DNA is binary code, as only two base pair combinations are possible, and extrapolates from that to wonder whether an author of synthetic DNA could copyright it OSS-style. Very few biotechnology companies today can produce and market a product profitably, so the value of proprietary IP may be overstated in this industry. I don’t hold out much hope for this approach being adopted by any large players.
Overall, this collection of essays is extremely thought provoking. The focus here is not code, but the issues surrounding open source software. By extension several essays also look at how these principles can be applied in other areas. Highly recommended.
Posted by Brian
The Age of Spiritual Machines: When Computers Exceed Human Intelligence
By Ray Kurzweil
My 52nd book of the year was Ray Kurzweil’s The Age of Spiritual Machines. Ray Kurzweil is an inventor and futurist who is best known for his ideas on the technological singularity. Despite the fact that the theory states that after a certain point it is impossible to predict the future, that is precisely what Spiritual Machines attempts to do. The crux of Kurzweil’s argument is the Law of Accelerating Returns, which states that the pace of technological innovation is increasing at an exponential rate. By projecting these advances out for a hundred years he draws some startling conclusions about what life will be like by 2099. Overall, this book was very good. It was published in 1999 and Kurzweil makes many predictions for 2009. Instead of writing a coherent review I will instead list some of these and see how he did. There are a ton more in this book, but I am only going to hit on some of them. Remember, these are all predictions made in 1999 for 2009.
Supercomputers will reach human brain speed.
The human brain can perform 20 million billion calculations/second. Supercomputers broke the petaflop barrier this past year. That means that the human brain still performs 20x more calculations/second. We are not there yet, but the goal is within reach. Kurzweil was probably a few years off with this prediction.
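The "few years off" judgment is just the arithmetic of closing a 20x gap. The brain figure is Kurzweil’s own estimate; the 18-month doubling period is my assumed Moore’s-law-style rate, not a measured one.

```python
import math

# Brain vs. supercomputer arithmetic. The brain estimate is Kurzweil's;
# the doubling period is an assumption for illustration.
brain = 20e15          # 20 million billion calculations/second
supercomputer = 1e15   # roughly 1 petaflop, the barrier just broken

gap = brain / supercomputer            # 20x
doublings = math.log2(gap)             # ~4.3 doublings needed
years = doublings * 1.5                # assume a doubling every 18 months

print(f"gap: {gap:.0f}x, ~{years:.1f} years to close")  # ~6.5 years
```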
This prediction on scanning resolution was blown away. In 2007 MRI resolution of 90 nanometers was achieved, and resolutions of 3 microns are standard.
Virtual sex with a real person with full visual and auditory realism.
Not. Even. Close. Virtual reality is routinely predicted by futurists, but nothing ever comes of it.
Implementation of self-driving cars feasible
Progress has actually been pretty good with this. The DARPA Grand Challenge gave the technology high visibility and good media exposure. Kurzweil only predicted that it would be feasible, which is the case. He probably realized the massive hurdles in infrastructure costs and liability for injury that will delay deployment for many years to come.
Most people have at least a dozen computers on them at all times networked into a “body LAN”
Nope. There are some small sub-cultures that promote living constantly hooked into several computers, but they are far from mainstream.
Most memory in portable devices will be electronic.
Right on with this one. Electronic memory is ubiquitous in portable devices. I haven’t heard of a new portable device with a hard drive in several years.
Most portable devices won’t have keyboards
It is difficult to tell if Kurzweil meant that the concept of the keyboard would be gone or just physical keyboards. If he meant physical keyboards, he is well on his way to being correct. If he meant the whole concept of a keyboard he is very wrong. Judging from his other predictions on voice recognition I am betting he meant the latter.
Privacy concerns will prevent people from storing data in the cloud.
This one is a mixed bag. Privacy concerns definitely loom, but most people are oblivious to them. It will take a large security breach at a company like Google or Facebook to change this.
Cables are disappearing in favor of short-distance wireless for components, such as monitors, printers, keyboard, etc.
They are disappearing for those who want them to. Cables are still cheaper though, and most people do not care. With that being said, wireless connections for components are very common. This one was a pretty easy call though, as the trend was already beginning back in 1999.
Most text is created with continuous speech recognition, which is more accurate than human transcriptionists.
Very wrong. Most text is still typed into a keyboard as I am doing now. Some new services are coming online though, such as Google Voice’s voice transcription. Quality is still spotty at best, but should continue to improve. Far from being as accurate as a human transcriptionist though.
Advances in displays will bring higher resolution, higher contrast, larger viewing angle, and no flicker.
Advances here have been constant, with all of the above coming true. The picture quality of a modern HDTV far exceeds that of what was available for a reasonable price in 1999. He also predicted a rise of what we now call e-readers to take advantage of these advances.
Computer displays in eyeglasses.
These are an active area of research and prototyping, but I know of nothing that has made it to the consumer market yet. There seem to be some high end products out there (here for example) though.
Three-dimensional chips will be commonly used.
3-D chips are still in early research stages, with most estimates placing them at least a decade out. The most recent story I have seen is here.
Trillion calculations/second for $1000 PC
Not there yet. Wikipedia gives a speed of 76 billion calculations/second for an Intel Core i7 Extreme 965EE, which currently costs about $1000. That doesn’t include the rest of the computer around it, either.
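The size of the miss is easy to quantify. As with the supercomputer prediction, the 18-month doubling period for price-performance is my assumed rule of thumb, not a measured figure; the chip number is the Wikipedia one quoted above.

```python
import math

# How far the $1000 PC was from the predicted trillion calculations/second.
target = 1e12   # predicted: 1 trillion calculations/second for $1000
actual = 76e9   # Intel Core i7 Extreme 965EE, ~$1000

shortfall = target / actual   # ~13x short of the prediction
# At an assumed price-performance doubling every 18 months:
years_behind = math.log2(shortfall) * 1.5

print(f"{shortfall:.1f}x short, ~{years_behind:.1f} years behind")
```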
Posted by Brian
Blood: An Epic History of Medicine and Commerce
By Douglass Starr
This was the last of my wife’s popular medical books that she picked up at the library book sale this summer. I ended up reading all of them before she read any. Chronicling the development of blood transfusions from ancient times through today, Blood tells an interesting story. Early transfusion methods and ideas look crude and odd in modern times. For example, early experiments involved transfusing blood from farm animals into patients in an attempt to change their personalities. Needless to say, this didn’t work.
The primary focus in Blood is on two subjects. First, Starr details the rise of the blood industry, mostly beginning with work done during World War II. This was focused on battlefield transfusions and led to the development of technology to separate plasma and to handle blood and its derivative products in bulk. From here the industry exploded, with a number of new blood derivatives entering the market, stretching the number of people who could be treated for each unit of blood given. These developments also led to blood products being combined in large vats of thousands of units, each from a different person, which led to the focus of the rest of the book.
Today the safety of blood transfusions is excellent. It is very rare for anybody to get sick from a blood transfusion, but that wasn’t the case until the 1990s. Before then tens of thousands of Americans contracted hepatitis from transfusions, many because of the large vats used to make derivative products. If even one unit of the tens of thousands contains hepatitis, the whole vat will be contaminated. This was a known issue for the industry, but there were no tests initially. When tests became available the industry deemed it not cost effective to test every unit. The turning point came when AIDS came on the scene. The clotting factor that hemophiliacs use is derived from blood products pooled from tens of thousands of units. By the 1980s standard hemophiliac treatment involved regular injections of this clotting factor, which led to as many as fifty percent of hemophiliacs in some areas contracting AIDS. To compound the problem the industry refused to admit there was a problem, leading to many lawsuits. The details of the industry’s indecision were exposed in court.
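The danger of pooling is worth spelling out with a little probability. The prevalence figure below is a hypothetical illustration, not a number from the book: even a tiny per-donation infection rate makes contamination of a vat of thousands of units near-certain.

```python
# Probability that a pooled vat contains at least one infected unit,
# assuming independent donations. The 0.1% prevalence is a made-up
# illustrative figure, not one from Blood.

def p_pool_contaminated(prevalence, pool_size):
    """Probability at least one of `pool_size` units is infected."""
    return 1 - (1 - prevalence) ** pool_size

for pool in (10, 1_000, 10_000):
    print(pool, round(p_pool_contaminated(0.001, pool), 4))
```

At a 0.1% infection rate a 10-unit pool is almost certainly clean, a 1,000-unit pool is contaminated more often than not, and a 10,000-unit vat is essentially guaranteed to be contaminated.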
Blood leaves the reader with a bad taste in their mouth. Many in the blood industry turned their backs on those being treated with their products. They either refused to admit there was a problem or decided they couldn’t do anything about it. Many knew that admitting it would affect the bottom line and chose to do nothing. If you are interested in how the blood industry developed, then Blood is an excellent read.
Posted by Brian
The Productive Programmer
By Neal Ford
One of my biggest pet peeves is seeing programmers who never learn how to use their tools. Using the same tools year after year without even taking the time to learn their hot keys strikes me as stupid. With this in mind (and on the recommendation of a co-worker) I read (his copy of) The Productive Programmer.
One of Ford’s points in The Productive Programmer is that computers excel at automating repetitive tasks, yet many programmers spend an absurd amount of time doing just that. Our skill set gives us the unique ability to use computers in a way drastically different than most people, yet many programmers never break the old habits they gained before becoming programmers. He also promotes the use of tools to enhance speed, such as an application launcher, and the use of macros and hot keys to speed up your use of applications you use everyday. He promotes the use of virtual desktops and multiple monitors to enhance focus.
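The sort of thing Ford means is the small chore you catch yourself doing by hand over and over. Here is a made-up example of my own, not one from the book: normalizing a directory of messy file names with a few lines of script instead of renaming them one at a time.

```python
# Automating a repetitive chore: lowercase every file name in a
# directory and replace spaces with underscores.
import os

def normalize_names(directory):
    """Rename files in `directory` to lowercase, underscore-separated names."""
    for name in os.listdir(directory):
        cleaned = name.lower().replace(" ", "_")
        if cleaned != name:
            os.rename(os.path.join(directory, name),
                      os.path.join(directory, cleaned))
```

Ten seconds of scripting replaces minutes of clicking, and the script is still there the next time.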
This book inspired me to refine many things I was already doing. On Windows I am making more use of Launchy than ever for various tasks. On Linux I use Gnome-Do more than ever. I have four virtual desktops set up on Windows with Dexpot and four on Linux with the built-in functionality of Gnome. It reinforced several programming techniques I already follow, such as composed methods, and inspired me to look into using dynamic scripting languages more for everyday tasks.
With its heavy emphasis on Java and scripting languages the specific examples here won’t apply to every reader, but that isn’t what is important. The Productive Programmer reads more like a philosophy book, inspiring the reader to interact with their computer in a new, more productive way. A must read.
Posted by Brian
The Antibiotic Paradox: How the Misuse of Antibiotics Destroys Their Curative Powers
By Stuart B. Levy, M.D.
When penicillin was first made available to the public it was touted in the press as a miracle drug that would end disease, leading to it being used to treat many illnesses that it had no real power to cure. Even today many still go to the doctor demanding an antibiotic to treat a cold because they do not understand that antibiotics only treat bacterial diseases, not viral. Stuart Levy explores this misuse of antibiotics in The Antibiotic Paradox.
So what is the paradox exactly? The more you use antibiotics, the less useful they become. An antibiotic does not kill every bacterium it comes in contact with. Those that are left usually have some form of resistance that can be transferred to other bacteria. In effect, the use of an antibiotic selects for ever greater resistance to that antibiotic. It gets worse though. Even though we have approximately 100 antibiotics to choose from, many of them are chemically similar enough that resistance to one will also confer some level of resistance to another.
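The selection mechanism Levy describes is stark when you run the numbers. The survival rates below are invented for illustration, not figures from the book: each exposure kills susceptible bacteria preferentially, so the resistant fraction of the survivors ratchets upward fast.

```python
# Minimal selection sketch: resistant bacteria survive each antibiotic
# exposure, susceptible ones mostly don't. All rates are invented
# illustrative assumptions.

def resistant_fraction_after(exposures, start=0.001,
                             susceptible_survival=0.1):
    """Resistant fraction of the population after repeated exposures."""
    r = start
    for _ in range(exposures):
        surviving_resistant = r                  # resistant: all survive
        surviving_susceptible = (1 - r) * susceptible_survival
        r = surviving_resistant / (surviving_resistant + surviving_susceptible)
    return r

for n in (0, 1, 3, 5):
    print(n, round(resistant_fraction_after(n), 3))
```

Starting from one resistant bacterium in a thousand, resistance is the majority after three exposures and nearly the whole population after five.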
This paradox is a problem even with responsible use of antibiotics, and Levy makes clear that proper use should not be stopped. His problem is with unnecessary and incorrect use. The patient who demands antibiotics unnecessarily can cause resistance that affects us all. The person who pops a few pills when he feels a little run down is selecting for resistance in his own body for no reason. The overuse of antibiotics in livestock as a growth promoter limits the types that can be used in people because resistance has already been selected for. And the explosion in unnecessary antibiotic products in the home can, paradoxically, make your home less safe.
Levy has crafted an excellent exploration of the consequences of the abuse of antibiotics in society. Since these are the only drugs whose abuse can actually cause more disease in society, he pushes for antibiotics to be split into their own class of drugs, with regulation that recognizes the unique role they play. Along with Infection: The Uninvited Universe, give The Antibiotic Paradox to a hypochondriac friend near you.
Posted by Brian
The Gunslinger
By Stephen King
My brother is a big Stephen King fan, so when I ran out of reading material on a trip I grabbed The Gunslinger, the first book in King’s Dark Tower series, from him. His copy is the revised and expanded edition, which contains 35 more pages than the original and some changes to make it more consistent with the plot of the later books. King started the series without knowing where he wanted to go with it, and mistakes were made.
Explanations as to what the hell is going on are slim. The book jumps right in with the protagonist, Roland, chasing after the man in black for unknown reasons. The next 300 pages start to fill in the details about who Roland is and why he is chasing the man in black, although the full reasons for the latter are still unclear. Along the way Roland picks up a child, who ends up getting caught up in his obsessive pursuit.
King was young when he wrote this and it shows in parts. The writing can be rough, with a disconnected feel, although that may just be my frustration at not knowing much of what is going on. The Gunslinger proves that it is possible to write a 300 page book that feels like a prologue. You finish feeling like nothing much has happened. After I finished this one my brother told me not to read the next without first reading some of King’s other books. Apparently the rest of the Dark Tower series ties in with his other novels, meaning I will most likely embark on a massive Stephen King project next year.
Posted by Brian
Infection: The Uninvited Universe
By Gerald N. Callahan
Absolutely fantastic. This is one of the few non-fiction books I have read this year that I would recommend to anybody. In Infection, Gerald Callahan describes how infection is what makes us human. From the helpful bacterial flora in our intestines to disease changing the course of history, infection shapes us and the world we live in.
The book is divided into three parts. The first covers good germs and is the most fascinating portion of the book. It starts before birth, where bacteria in the birth canal (Lactobacilli) help prevent premature delivery. Proteins from breast milk then serve as fertilizer for the bacteria that populate the child’s intestines, an essential part of health. From here we learn about parasitic worms being used to treat diseases and the possible role of bacteria in mental health. The big point is that bacteria are not always the enemy, and that extreme cleanliness can actually be to the detriment of your health. The last two sections cover the role of bacteria in shaping the world as we know it and some diseases that could alter the course of history. These sections are also interesting, but not as much as the first.
Callahan’s writing is excellent. He mixes in stories from his own family and how bacteria shaped them. He strikes a nice balance between medical and popular that will appeal to a wide audience while still being informative. I highly recommend picking this up. It will help you rethink how you view the bacteria surrounding us.