David Baker's lab at the University of Washington has been working on several approaches to protein structure problems. I mentioned Rosetta@home here, and now the team has published an interesting paper on another of their efforts, Foldit.
That one, instead of being a large-scale passive computation effort, is more of an active process - in fact, it's active enough that it's designed as a game:
We hypothesized that human spatial reasoning could improve both the sampling of conformational space and the determination of when to pursue suboptimal conformations if the stochastic elements of the search were replaced with human decision making while retaining the deterministic Rosetta algorithms as user tools. We developed a multiplayer online game, Foldit, with the goal of producing accurate protein structure models through gameplay. Improperly folded protein conformations are posted online as puzzles for a fixed amount of time, during which players interactively reshape them in the direction they believe will lead to the highest score (the negative of the Rosetta energy). The player’s current status is shown, along with a leader board of other players, and groups of players working together, competing in the same puzzle.
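The scheme the authors describe can be caricatured in a few lines of code: a deterministic local-descent "tool" (standing in for Rosetta's refinement routines) plus a human-style decision to accept a temporarily worse conformation in order to escape a local energy minimum. Everything below is invented for illustration: the one-dimensional toy energy function, the step sizes, and the size of the "player move" have nothing to do with the real Rosetta energy function or Foldit's interface.

```python
# Toy sketch of a hybrid human/computer search (not Rosetta).
# A deterministic descent tool refines locally; the "player" decides
# when to jump to a worse-looking conformation to cross a barrier.

def energy(x):
    # Invented double-well energy: a shallow local minimum near x = -1
    # and a deeper global minimum near x = +1, separated by a barrier.
    return (x * x - 1) ** 2 - 0.3 * x

def wiggle(x, step=0.01, iters=2000):
    """Deterministic greedy descent, analogous to an automated refinement tool."""
    for _ in range(iters):
        if energy(x + step) < energy(x):
            x += step
        elif energy(x - step) < energy(x):
            x -= step
        else:
            break
    return x

local = wiggle(-2.0)           # descent alone gets trapped near x = -1
jump = local + 0.9             # a "player move": deliberately go uphill...
restarted = wiggle(jump)       # ...then hand back to the tool, which
                               # now descends into the deeper well

score_local = -energy(local)       # Foldit-style score: negative energy
score_restarted = -energy(restarted)
```

The jump itself lands at a higher energy than the trapped state, which is exactly the kind of move a greedy algorithm refuses to make, yet it is what lets the subsequent deterministic refinement reach the lower-energy basin.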
So how's it working out? Pretty well, actually. It turns out that human players are willing to make more extensive rearrangements of the protein chain in pursuit of lower energies than computational algorithms are, and they're also better at judging which starting positions are worth pursuing. Both of these remind me of the differences between human and machine chess play, as I understand them, and probably for quite similar reasons. Baker's team is trying to adapt the automated software to use some of these human-style approaches where feasible.
There are several dozen participants who clearly seem to have done better at finding low-energy structures than the rest of the crowd. Interestingly, they're mostly not employed in the field, with "Business/Financial/Legal" the largest self-declared group among a wide range of fairly evenly distributed categories. Compared to the "everyone who's played" set, the biggest difference is that the high-end group has proportionally far fewer students. That group of better problem solvers also tends to be slightly more female (although both groups are still mostly men), definitely older (that loss of students again), and less well-stocked with college graduates and PhDs. Make of that what you will.
Their conclusion is worth thinking about, too:
The solution of challenging structure prediction problems by Foldit players demonstrates the considerable potential of a hybrid human–computer optimization framework in the form of a massively multiplayer game. The approach should be readily extendable to related problems, such as protein design and other scientific domains where human three-dimensional structural problem solving can be used. Our results indicate that scientific advancement is possible if even a small fraction of the energy that goes into playing computer games can be channelled into scientific discovery.
That's crossed my mind, too. In my more pessimistic moments, I've imagined the human race gradually entertaining itself to death, or at least into stasis, as our options for doing so become more and more compelling. (Reading Infinite Jest a few years ago probably exacerbated such thinking.) Perhaps this is one way out of that problem. I'm not sure that it's possible to make a game compelling enough when it's hooked up to some sort of useful gear train, but it's worth a try.