Johns Hopkins Magazine
Wholly Hopkins
 
Matters of note from around Johns Hopkins


Earth Sciences: Too warm for ozone repair?

University: Stimulus may boost Hopkins

Nursing: Economic stress can kill

Research: Flight of the pterosaurs

APL: Headbanging for a better arm

International Studies: Paulson now a SAIS fellow

Books: The sacred narratives of war

Peabody: A very cool — and very cold — inaugural gig

Astrophysics: Pondering deep, deep math

Sports: Twilight for crew

Proteomics: Mapping proteins on a tight budget


Too warm for ozone repair?

Spurred by reports from scientists who warned that Earth's ozone layer had been extensively damaged — a phenomenon that received worldwide attention when scientists detected a continent-sized ozone hole over Antarctica in 1985 — governments around the world made a pact 22 years ago to ban the production of gases responsible for it. But in a recent article in Geophysical Research Letters, scientists at Johns Hopkins and NASA say that the continuing ban on ozone-depleting chlorofluorocarbons (CFCs) used in refrigerants and some aerosol spray cans might not be enough to quickly reverse the damage they have inflicted on the atmosphere.

Photo by istockphoto.com

While levels of CFCs and related airborne compounds have decreased since peaking in the 1990s, an even more familiar offender — planet-warming greenhouse gases — might be preventing the ozone in some regions from returning, researchers say. Their finding is especially significant for parts of the Southern Hemisphere, where the recovery rate of ozone is at its lowest. A gas that is present naturally in the atmosphere, ozone protects living things on earth from harmful levels of the sun's ultraviolet rays, which cause skin cancer and threaten several species of plants.

Ozone is naturally created and destroyed around the planet all the time. But greenhouse gases are changing some of the systems in the atmosphere that typically encourage the ozone layer to rebuild itself. Using NASA simulations of the atmosphere that kept the amount of ozone-depleting substances constant to observe the effects of greenhouse gases, researchers determined that depleted ozone is returning at different rates in different parts of the lower stratosphere, the atmospheric layer 12 to 25 kilometers above the earth where 90 percent of ozone resides. An excess of carbon dioxide — the most prominent greenhouse gas — causes the stratosphere to cool, which actually helps form ozone. But chemical reactions spurred on by a warming climate counteract that cooling in the stratosphere's lower levels, and an increase in the upward movement of warm air in the southern tropics brings in more ozone-poor air from the troposphere below. The result is that the ozone layer takes longer to repair itself.
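The competing effects can be pictured with a rough back-of-the-envelope sketch (emphatically not the NASA simulation itself, and with invented rate constants) in which chemical recovery from declining CFCs is partly offset by a region-dependent penalty from the increased upwelling of ozone-poor air:

```python
# Toy illustration only: invented rate constants, not values from the study.
def ozone_anomaly(years, chemical_recovery_rate, upwelling_penalty):
    """Ozone anomaly (percent below pre-CFC levels) after each year,
    starting from a 6 percent deficit and never overshooting zero."""
    anomaly = -6.0
    history = []
    for _ in range(years):
        # Falling CFC levels let chemistry rebuild ozone, while stronger
        # upwelling of ozone-poor tropospheric air works against it.
        anomaly = min(0.0, anomaly + chemical_recovery_rate - upwelling_penalty)
        history.append(round(anomaly, 2))
    return history

# Hypothetical contrast between regions with weak and strong upwelling changes.
north = ozone_anomaly(50, chemical_recovery_rate=0.15, upwelling_penalty=0.02)
south = ozone_anomaly(50, chemical_recovery_rate=0.15, upwelling_penalty=0.12)
print("Northern-style region after 50 years:", north[-1])          # back to 0.0
print("Southern-tropics-style region after 50 years:", south[-1])  # still -4.5
```

The point of the sketch is only that the same chemical recovery rate can leave one region essentially healed while another still lags, which is the pattern the simulations show.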

"The take-home message from this is you can't necessarily expect the declining rate of CFCs to lead to ozone repair because of the changes that have come from global warming," says Darryn Waugh, a professor of earth and planetary sciences at the Krieger School of Arts and Sciences. Even as CFCs wane, greenhouse gases could threaten the development of ozone in the southern tropics for decades, at least. Carbon dioxide molecules have a lifetime of more than 100 years and "tend to distribute themselves very evenly around the world," Waugh says. "It doesn't matter where they're produced. They find ways to hang around."

Why aren't northern latitudes similarly affected by the phenomenon? Waugh says the change in air movement in the lower stratosphere is much smaller in the Northern Hemisphere, and there is still enough cooling in the upper stratosphere for the ozone layer to recover earlier from the damage done by CFCs.

"In the big picture, it'll take 50 to 60 years for the ozone to return in most places because carbon dioxide, like CFCs, has a long lifetime." The study's results could have implications for the health of the planet and some of the people who inhabit areas where the ozone will be slowest to return. Fair-skinned people in countries such as Argentina, Australia, and New Zealand (Waugh's native country) could be affected by years of more exposure to skin cancer-causing levels of ultraviolet rays. Even though scientists didn't attempt to forecast an increase in the amount of UV radiation in the region, "I wanted policymakers to know that it isn't realistic to assume that all of the ozone will come back at once," Waugh says.

The researcher adds that there are few further actions nations can take to change the situation in the stratosphere. "We've basically done all we can do," Waugh says. "In the big picture, it'll take 50 to 60 years for the ozone to return in most places because carbon dioxide, like CFCs, has a long lifetime. The delay in the Southern Hemisphere might not turn out to be huge — maybe 20 years more — because of the different dynamics, but it could be significant."

In the future, Waugh and his NASA collaborators will look to see if depleted ozone contributes to global warming by allowing in more earth-warming rays of the sun. "Our focus now is on seeing how changes in the stratosphere affect climate, as well as looking at air quality to see how it is affected by greenhouse gases," he says. "We want to look at the whole cycle." — Michael Anft


Stimulus may boost Hopkins

When Barack Obama signed the American Recovery and Reinvestment Act (ARRA) on February 17, the $787 billion economic stimulus package included several provisions that should benefit Johns Hopkins. Principal among them is $10.4 billion for the National Institutes of Health. Of the NIH funds, $8.2 billion will go to support scientific research, and of that amount $7.4 billion will be distributed to the various NIH institutes and centers to fund research grants. A February 23 statement from NIH acting director Raynard S. Kington enumerated priorities for the ARRA spending. Much of the money will go to grants already in the pipeline — proposals that have been reviewed but not yet funded. Additional funds will be allocated for targeted supplements to current grants, and to new applications "that have a reasonable expectation of making progress in two years." As Johns Hopkins Magazine went to press, there was no indication of how many proposals from Hopkins might receive funding from the stimulus dollars, but the institution usually competes well in securing NIH support. Federal agencies were in the process of reviewing, on an expedited basis, requests for ARRA funds that would be disbursed throughout the summer and fall.

In addition to the ARRA money, at press time the NIH was poised to receive $30.3 billion upon passage of the Fiscal Year 2009 Omnibus Appropriations Bill, a $938 million increase over fiscal year 2008. Other ARRA provisions that could benefit Hopkins include $3 billion for the National Science Foundation, $1.6 billion for the Department of Energy Office of Science, $1 billion for NASA, and $87 billion to the states for Medicaid programs, as well as increased funds for student aid and research infrastructure improvements. — Dale Keiger


Economic stress can kill

That financial woes lead to stress surprises no one. But new research out of the School of Nursing has found that for older women, money problems can be deadly. Data from two longitudinal studies show that older women who reported that they could not make ends meet died sooner than those who reported having sufficient money. Financial strain may be an even better predictor of mortality than actual income. "Twenty thousand dollars for someone might be enough, and for someone else it might not," says Sarah Szanton, an associate professor in Nursing and the study's first author. "Income itself doesn't tell you about the balance between income and need."

Photo by istockphoto.com

Szanton and her co-authors looked at the impact of "financial strain" on mortality in 728 women in their 70s living in and around Baltimore. The researchers mined data collected in the 1990s by the Hopkins-led Women's Health and Aging Studies I and II. In the WHAS research, each woman was asked, "At the end of the month, do you have some money left over, just enough, or not enough?" Szanton and her colleagues compared data from the women's responses to mortality data for a five-year period during which 117 of the women died. They found that women who had reported financial strain were nearly 60 percent more likely to die within five years than those who didn't. Not only were women who responded "not enough" more likely to die than those who responded "just enough," women who responded "just enough" were more likely to die than those who had "some money" at the end of the month. The finding held regardless of age, education, income, chronic disease, health insurance status, or race, although the association between financial strain and mortality was two and a half times higher among black women.
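The headline comparison is essentially a ratio of five-year death rates between groups defined by the end-of-the-month question. The counts below are hypothetical, chosen only so that they sum to the study's 728 women and 117 deaths; the published figure of roughly 60 percent came from models that also adjusted for age, income, disease, and other covariates:

```python
# Hypothetical group counts for illustration; not the WHAS data themselves.
def relative_risk(deaths_exposed, n_exposed, deaths_unexposed, n_unexposed):
    """Ratio of five-year death rates between an exposed and a reference group."""
    return (deaths_exposed / n_exposed) / (deaths_unexposed / n_unexposed)

# Reference group: women reporting "some money left over" at month's end.
rr_not_enough = relative_risk(40, 150, 45, 380)   # "not enough" vs. reference
rr_just_enough = relative_risk(32, 198, 45, 380)  # "just enough" vs. reference

print(f"'not enough' vs. 'some money left over': {rr_not_enough:.2f}x the death rate")
print(f"'just enough' vs. 'some money left over': {rr_just_enough:.2f}x the death rate")
```

Even with these invented counts the gradient the researchers describe shows up directly: both strained groups die at a higher rate than the reference group, and "not enough" fares worse than "just enough."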

Kenneth Ferraro, a sociology professor at Purdue University and editor of the Journal of Gerontology, where the paper appeared last December, says the stronger link between financial strain and mortality among black women could be explained by the duration of the stress. "My hunch is that African-American women would be more likely to say they have had [financial strain] for a longer period of time," he says. The study suggests that over the course of their lives African-American women likely had more cumulative health disadvantages than white women, which may have taken a greater physiological toll. The authors also point out that due to residential segregation, African-American women tend to live in neighborhoods with lower-quality housing and less access to services and resources.

The researchers have yet to explain the association between financial strain and mortality, but Szanton sees two possibilities. The stress itself might cause detrimental physiological changes. Or, she says, it might influence how people behave, causing them to undereat, or cut dosages of their medications to make them last longer. "But you can't conclude either one of those based on this article," she says. "It's the subject of future research." —Cassandra Willyard, A&S '07 (MA)


Flight of the pterosaurs

Sixty-five million years ago, giant pterosaurs soared over the landscape of the Late Cretaceous period. Their narrow heads were five feet long, their wings were up to 35 feet across, and thin, fuzzy, hairlike scales ran down their backs. Their wings doubled as forelimbs, so they could walk on all fours with their shoulders 10 feet off the ground. They were the size of adult modern giraffes, albeit giraffes with beaks, and looking at them prompts an obvious question: How did something so big ever launch itself into the air?

Pterosaurs flew, despite being the size of adult giraffes.
Illustration by Mark Witton
Some scientists have hypothesized that pterosaurs dropped from cliffs to get airborne, like extreme sports enthusiasts. Others have imagined the creatures launching themselves with their back legs, like modern birds. Both ideas are problematic. Cliffs would not always have been conveniently nearby, and pterosaurs were not built like birds, which have relatively thick thigh bones that permit running or leaping into the air. Michael Habib believes he may have solved the problem. He thinks pterosaurs vaulted into the air using their wings as powerful forelimbs.

Habib, a doctoral candidate at the School of Medicine's Center for Functional Anatomy and Evolution, ordinarily concentrates on the mechanics of bird flight. But about two years ago, friends asked him how big a flying animal could get. So he started studying the pterosaurs, the largest animals ever to fly. One night, watching a nature video that showed vampire bats crawling around on cattle by using their wings as front legs, Habib was reminded of pterosaurs because they walked in the same manner. The bats take off by crouching to build up energy in their tendons and then, with a Herculean push-up, vaulting hind feet first into the air, their front limbs propelling them to flight. If pterosaurs walked like vampire bats, Habib wondered, could they also take off like vampire bats?

He took bone measurements from pterosaurs, plugged the data into equations that detailed bones' ability to withstand force, and determined that their skeletons could support such a vault. The pterosaurs' shoulders already stood 10 feet in the air, and Habib thinks the vault could have propelled them up another 10 feet, giving them more than enough room and speed for the first mighty flap of their wings. He has calculated that they could go from 0 to 33 miles per hour in 0.6 seconds. They may have had incentive for getting airborne quickly. After all, they lived alongside Tyrannosaurus rex. "T-Rex is a really good reason to take off in less than a second," says Habib.
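The quoted launch figures are easy to sanity-check. A short calculation, using only the numbers in the article rather than Habib's bone-strength equations, converts 0 to 33 miles per hour in 0.6 seconds into an average acceleration of roughly two and a half times gravity:

```python
# Check the launch figures quoted above; this is unit conversion, not
# Habib's bone-loading analysis.
MPH_TO_MS = 0.44704   # metres per second per mile per hour
G = 9.81              # standard gravity, m/s^2

top_speed = 33 * MPH_TO_MS   # about 14.8 m/s
launch_time = 0.6            # seconds

mean_acceleration = top_speed / launch_time
print(f"Mean acceleration: {mean_acceleration:.1f} m/s^2, "
      f"about {mean_acceleration / G:.1f} g")

# Path length covered during the vault if the acceleration were roughly constant.
distance = 0.5 * mean_acceleration * launch_time ** 2
print(f"Distance covered while accelerating: {distance:.1f} m "
      f"({distance / 0.3048:.0f} ft)")
```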

Habib published his findings in a special, pterosaur-themed edition of the journal Zitteliana in December 2008. He admits there are problems with his model. Fossil footprints confirm that pterosaurs walked on all fours, but Habib concedes no launch footprints that would support his hypothesis have been recognized as yet. Dave Burnham, a paleontologist at the University of Kansas, recalls a set of pterosaur tracks recovered in France that appear to support a two-legged running launch, but says these could also be the fossil record of a pterosaur's landing. Another problem is that giant pterosaurs spent a good deal of time flying over the ocean. But they couldn't have taken off from the water by vaulting. So if one ever splashed down, how did it get back in the air? Habib says he's working on this problem. — Emily Laut, A&S '09 (MA)


Headbanging for a better arm

When two young Johns Hopkins researchers sought a better way to train amputees to use prosthetic arms and hands, they faced familiar obstacles: How do you convert signals from the patient's remaining nerves into messages that move each mechanized finger immediately and with dexterity? Once you have figured that out, how do you teach patients the skill?

Iraq vet Jonathan Kuniholm trains with Guitar Hero.

Last summer, Robert Armiger, Eng '06 (MS), a biomedical engineer at the Applied Physics Laboratory (APL), and Jacob Vogelstein, Eng '07 (PhD), a senior staff member at APL and an assistant professor of electrical and computer engineering at the Whiting School of Engineering, found what they believe to be an answer: Guitar Hero, the mega-selling video game on the Nintendo Wii platform. In Guitar Hero, you "play" guitar along with computerized music videos, fretting the notes by pushing buttons with your left hand on the neck of a guitar-shaped controller, and strumming a second button with your right hand. You earn game points by playing the correct note at the correct time, so it requires speed and accuracy.

Armiger and Vogelstein used a soldering gun to rejigger the game's controller so it could be played virtually on a computer monitor by the equivalent of one hand. The two engineers adapted complex algorithms that convert electrical signals from nerves in the arm into computer control signals that correspond to individual movements of the hand. Then, they used hand-movement control signals to simulate pressing the game controller's buttons. Once the system was tested, a retired U.S. Marine Corps captain who is also a biomedical engineer was able to attach the electrodes and begin training the system to interpret muscle signals.
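The article does not spell out APL's algorithms, but the general shape of such a pipeline is straightforward: electrode signals are cut into short windows, reduced to an amplitude feature, and mapped to virtual button presses. The sketch below is a generic illustration of that idea; the channel-to-button mapping and thresholds are invented:

```python
# Generic sketch of a muscle-signal-to-button pipeline; thresholds and the
# channel-to-fret mapping are invented for illustration, not APL's system.
import math

FRET_BUTTONS = ["green", "red", "yellow", "blue", "orange"]
THRESHOLDS = [0.30, 0.28, 0.32, 0.35, 0.33]   # per-channel triggers (made up)

def rms(window):
    """Root-mean-square amplitude of one window of muscle-signal samples."""
    return math.sqrt(sum(s * s for s in window) / len(window))

def decode_window(channels):
    """Map one multichannel window (one list of samples per electrode)
    to the set of fret buttons to 'press' on this frame."""
    return {
        button
        for button, threshold, window in zip(FRET_BUTTONS, THRESHOLDS, channels)
        if rms(window) > threshold
    }

# Example frame: the third channel shows a strong contraction, the rest are quiet.
frame = [[0.02, -0.03, 0.01]] * 2 + [[0.50, -0.45, 0.48]] + [[0.01, 0.02, -0.02]] * 2
print(decode_window(frame))   # {'yellow'}
```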

Armiger and Vogelstein worked with the captain, Jonathan Kuniholm, who had lost his hand and most of his forearm to a homemade bomb in Iraq, to "play" along with Pat Benatar's top-10 song from 1980, "Hit Me With Your Best Shot" (chosen because it was short and at the game's basic level, not because of its name). Kuniholm used electrical signals sent from his brain to the remaining nerves in his forearm to "push" the rewired controller's buttons. By the end of a few training sessions, the captain could nail about 80 percent of the notes, proving he was able to convert the signals from his forearm muscle contractions to commands that will eventually activate his prosthetic fingers. Armiger and Vogelstein knew they were on to something.

APL researchers have been working on developing a dexterous mechanized hand for several years, but have yet to develop accessible training systems for those who will one day use them. "We saw that there was a disconnect between what that hand will be and plans to get it into clinical practice," says Vogelstein. "People need to understand how the computer interprets muscle signals to learn to use these hands in a dexterous way." Current virtual reality training tools are repetitious, tedious, and tiring. "We had been looking for a way to make the algorithm training process better and less exhausting for the amputees," says Armiger. First he had tried the seminal 1970s video game Pong, but the game's glacial pace and limited number of movements required to play made it less than compelling. "I thought I'd be wasting my time coding that. We needed something more interesting."

The success of their new Guitar Hero model has caught the interest of leaders at "Revolutionizing Prosthetics 2009," a 30-institution project led by APL and financed by the federal Defense Advanced Research Projects Agency (DARPA). The project aims to develop revolutionary mechanical arms controlled by neural signals and to make them practical by the end of this year. "They just ordered us our second Wii," Vogelstein says. "So, that's a good sign."

In November, when Armiger and Kuniholm met at Walter Reed Army Hospital, in Washington, soldiers recovering from their injuries expressed excitement at the prospect of being able to play video games. "The demographic of the amputees is primarily young people who have been in the armed forces. Many of them have played Guitar Hero before and would like to do it again," says Vogelstein. He and Armiger say they'd like to convert other Wii games that require precise movements, such as bowling and tennis, to therapeutic tools for amputees. They will continue to use Guitar Hero as part of their work, but that's not all they're doing with the game. The 28-year-old Armiger (Vogelstein is 31) says, "We made sure that when we modified the controller that we could still play it in the usual way. It's one of those things that gets kind of addictive." — MA


Paulson now a SAIS fellow

The January inauguration of Barack Obama marked the end of Henry M. Paulson Jr.'s tenure as U.S. secretary of the treasury. It did not take him long to find another position. Within a week, the Nitze School of Advanced International Studies announced that Paulson was joining SAIS as a distinguished visiting scholar. He is now a fellow at SAIS' Bernard Schwartz Forum on Constructive Capitalism. He brings to the forum expertise not only in global finance but in China, which he has visited on more than 70 occasions, according to The New York Times.

Photo by AP Photo / Gerald Herbert
SAIS Dean Jessica Einhorn has described Paulson as an ideal visiting scholar because of his extensive experience in both the private and public sectors. She notes his long engagement with areas critical to contemporary international affairs.

Paulson was confirmed as the 74th treasury secretary on June 28, 2006. On that day, the Dow Jones industrial average closed at 10,924.74, down about 1 percent due to anxiety over interest rates. Little did anyone know that the anxiety was just beginning. Paulson eventually faced the unenviable task of guiding Treasury through one of the biggest economic rockslides in U.S. history, as several major financial institutions, including Lehman Brothers, Bear Stearns, Fannie Mae, Washington Mutual, and Freddie Mac collapsed. Paulson endured considerable criticism for his handling of the crises.

His government service began in 1970, with a stint at the Department of Defense. In 1974, Paulson joined the Chicago office of the Goldman Sachs Group. Twenty years later, he ascended to chief operating officer, and he became chief executive officer and chairman of the board in 1998. A conservationist — he stood out in the Bush administration for his views on the urgency of dealing with global warming — Paulson also served as chairman of the board of the Nature Conservancy from 2004 until his appointment as treasury secretary.

In December 2008, Time magazine named Paulson one of four runners-up for "Person of the Year." — DK


The sacred narratives of war

The 1862 Battle of Fredericksburg was among the most one-sided engagements of the Civil War. Confederate troops dug in atop the hills overlooking the Virginia town and mowed down attacking Union soldiers by the thousands. But despite assault after failed assault, Union soldiers kept charging toward certain death with, witnesses said, an expression of "seriousness and dread." Afterward, one soldier explained: "I was not only unafraid to die, but death seemed to me a welcome messenger. ... The New Jerusalem seemed to rise before me."

In his new book, Fighting Identity: Sacred War and World Change (Praeger Security International, 2009), defense analyst Michael Vlahos, a fellow at the Applied Physics Laboratory, argues that such battles became important chapters in the story the nation tells itself, its "sacred narrative." Part of the American sacred narrative is that the United States fights for freedom and democracy; the proof is in heroic Civil War tales like Fredericksburg. He calls the soldiers at Fredericksburg the "American Mujahideen." They died in incredible numbers because they believed in a cause, and that cause's promise of transcendence. Modern jihadist warriors in Iraq or Afghanistan, says Vlahos, are willing to die for a similar promise.

He argues that war is one of the chief ways that societies form their identities. Contemporary "non-state agents" that the United States treats as criminal forces — from Hezbollah to the Taliban to American urban gangs like MS-13 — now are creating their own identities through battle. These so-called "radical" groups have emerged in such numbers, Vlahos says, because the world is in the late stages of a globalization that has destroyed old identities. For instance, many parts of the Middle East, Asia, and Africa now operate under a "brittle crust of neo-Western governance," which has done nothing to replace the sense of identity that cultures there had prior to colonization. So new identities are naturally emerging, forged in part by war.

Photo by istockphoto.com

Vlahos bases much of his argument on analogies between the current time and two prior "globalizations," those of Late Antiquity (roughly A.D. 300 to 700) and the High Middle Ages (1100 to 1450). For example, he sees parallels between the United States and ancient Rome: "If you look at the Roman Republic around 100 B.C., for instance, it's tiny but everybody's ready to put on his battle harness and go out and fight. Three centuries later Rome needs a professional army run by the state to patch all the holes in the empire." In the book, Vlahos describes how the Roman Empire dealt with communities, like the Christians, that were resistant to Rome's ideological authority, by treating them as a threat. This only strengthened those communities' identities and eventually cost Rome its supremacy. The United States is taking the same approach in places like the Middle East. "We always reflexively reach for the bully approach," says Vlahos. "But world change is going to happen. Half of the world has been left behind by globalization, and they have to find their own way."

While Vlahos' analysis is dense with allusions to ancient military history and the sociology of war, his prescription is simple: The United States must step back and let the world evolve. "America's greatest strategic vulnerability is our inability to see things the way they are," Vlahos says. "We've chosen the wrong side of history, the status quo. History is about change." Instead of fighting insurgent groups — which only legitimizes them and strengthens their identity, says Vlahos — the United States should disengage from battles like those in Afghanistan and Iraq and develop relationships with some groups now considered enemies. Americans, he says, must change their paradigm, their national story. "We need to give up our vision of America as the necessary empire, redeemer to the world," he says. "I'd like to see us move toward a direction I call 'American altruism,' so we're actually in a position to support, save, rescue." — Andrea Appleton


A very cool — and very cold — inaugural gig

Peabody faculty member Anthony McGill had a very good seat at the inauguration of President Barack Obama — on stage, performing with Yo-Yo Ma, Itzhak Perlman, and Gabriela Montero. The ensemble played a new composition, "Air and Simple Gifts" by John Williams, as part of the January inauguration ceremony. McGill described the experience to the Chicago Tribune: "It was freezing, let me tell you."



Pondering deep, deep math

Do not open Mario Livio's new book looking for answers to big questions. In Is God a Mathematician? (Simon and Schuster, 2009), the two major questions that the Johns Hopkins astrophysicist examines — why mathematics is so powerful in explaining the universe, and whether mathematics is an invention or a discovery — matter far more than the answers, he says. We may never be able to answer fully these questions, or the one suggested by his book's title — how the universe began — but he says a scientist's work is to keep asking, exploring in part the uncanny ability of mathematics to explain phenomena large and small, ancient and modern.

Take Livio's "day job," as he calls it, at the Space Telescope Science Institute (he is also an adjunct faculty member in the Krieger School's Department of Physics and Astronomy). Livio uses mathematical formulas to try to explain or predict events in the universe — a supernova explosion, for example. An equation he might use to explain the movement of star clusters almost precisely explains the motion of subatomic particles in suspension, fluctuations in stock option pricing on Wall Street, and the motion of smoke particles suspended in air. Yet when the equation was created, most of those latter uses had not yet been imagined. Livio offers readers many such examples of mathematics' power to explain the world, adopting Nobel laureate Eugene Wigner's expression "the unreasonable effectiveness of mathematics." Early findings such as Newton's law of gravity and abstract knot theory, developed with relatively primitive means, have proven their relevance over centuries. Newton's law has been found to be precise down to less than one part per million. Knot theory has applications in explaining the subatomic world.
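The article does not name the equation Livio has in mind, but a textbook example of the phenomenon is the diffusion (heat) equation, which describes the Brownian jiggling of suspended particles, appears in stellar-dynamics treatments of star clusters, and, after a change of variables, underlies the Black-Scholes option-pricing equation:

```latex
% A standard illustration (the article does not specify Livio's equation):
% the same diffusion form recurs in very different settings.
\[
  \frac{\partial f}{\partial t} = D \, \frac{\partial^{2} f}{\partial x^{2}}
  \quad \text{(diffusion/heat equation: Brownian motion, smoke in air)}
\]
\[
  \frac{\partial V}{\partial t}
  + \tfrac{1}{2}\sigma^{2} S^{2} \frac{\partial^{2} V}{\partial S^{2}}
  + r S \frac{\partial V}{\partial S} - r V = 0
  \quad \text{(Black--Scholes option pricing, reducible to the heat equation)}
\]
```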

The second question his new book raises — Did humans invent mathematics or discover it? — is actually ill posed, Livio says. "Mathematics is an intricate combination of both invention and discovery," he says. "For example, Greeks invented the concept of axioms of geometry. They did not discover it. But once the concept of axioms was invented, theorems were discovered, like the Pythagorean theorem." The profundity, Livio says, lies not in the either/or question but in exploring the balance: How much is mathematics a system imposed by humans to order and explain the universe, and how much is it a great inherent truth that we've been working for centuries to understand and prove?

Livio sees his writing as an extension of his work at the institute, where he is director of the Office of Public Outreach. Is God a Mathematician? follows three recent books on math and science that were also written for a general ("but educated," he notes) audience: The Accelerating Universe (2000), The Golden Ratio (2002), and The Equation That Couldn't Be Solved (2005). "Passion is very important to communicate," Livio says. "What we want to achieve in science education is to have young people become passionate about the subject, and to see science as an integral piece of human culture. Even a poet should know that there are laws of nature, without which we couldn't begin to understand why things happen in the natural world."

Photo by John Bedke

And the God question itself? Livio, born a Romanian Jew, doesn't hesitate to answer. "I'm not really a religious person, but I have true respect for all religions," he says. "As a scientist who deals with questions about how the universe started, people ask me all the time about my beliefs. I feel that people who believe in God and have a strong need for God don't really want just a Creator God, who created the universe then left it to its own devices. They want a God who is there for them every day, every second. And science has absolutely nothing to say about a God like this, a God who guides all their moral and ethical thinking."

Livio adds, "To describe the physical phenomena that began the universe, you can use science. I find it absolutely outrageous when people try to take [the story of creation in] Genesis literally. That story is a beautiful description that expresses awe for the inexplicable, but it was never meant to be scientific theory."

He suspects that his work will continue to trend toward the reflective as he steps back from his telescopes a bit to consider the big picture. "It may have a little to do with my age," he says. "I tend to become a little more philosophical these days." Unlike mathematics itself, which is fairly black and white, the philosophy of mathematics has much gray area, an attribute the astrophysicist savors. "Just because mathematics is precise," he says, "doesn't mean it's not complex." — Lisa Watts


Twilight for crew

Photo by Getty Images / Frank and Helena

On February 12, Johns Hopkins athletic director Tom Calder announced discontinuation of men's and women's varsity crew, effective at the end of the 2009 spring schedule. The university could no longer sustain the teams at a competitive level, according to the athletic department's statement. Four days after the announcement, an online petition and fundraising effort to save the program appeared at save.jhucrew.org. At press time, the effort reported commitments of $171,296 to continue the sport at the varsity level for at least five years.



Mapping proteins on a tight budget

Scientists spent 13 years and $3 billion to map all 25,000 genes of the human genome. If a Johns Hopkins scientist has his way, he and his staff will do the same for the human body's millions of proteins — and they'll do it, at least to start, for $350,000 a year.

Illustration by Robert Neubecker
Just don't expect them to finish anytime soon.

The foray into the nascent field of proteomics is the brainchild of Akhilesh Pandey, an associate professor of genetic medicine and founder of the Institute of Bioinformatics, a nonprofit outpost he started seven years ago in Bangalore, India, to keep costs low. The Bangalore institute's 40 scientists aim to sequence the entire complex tangle of proteins responsible for growing tissue and sometimes heralding the development of disease, along with the genes that create and direct them. The institute has used some federal grant money and nominal licensing fees from commercial interests to publish an online database that is free for academic researchers — the Human Protein Reference Database (www.hprd.org). Pandey also encourages researchers to share protein-related information on a separate Web site he started (humanproteinpedia.org). He hopes both will become the definitive sources on the subject — a two-volume encyclopedia of sorts for scientists seeking ways to guide their experiments, even as other researchers fill in uncharted sections of the protein landscape.

"Our goal is to increase medicine's understanding of the building blocks of biology," says Pandey, whose wet lab at Hopkins also investigates proteins, including those instrumental in pancreatic cancer. "We're doing what 99.9 percent of researchers won't: start a nonprofit entity that aims to get to the bottom of complex and large systems." Dissatisfaction with research that lacks breadth is one thing that drives him, he says: "We're doing the opposite of what individual researchers tend to do, which is focus on smaller and smaller things so they can get funding. Sometimes, I think, people have lost sight of the goal of accelerating the pace of medicine."

There may be a reason why only a handful of other researchers worldwide are drawing up protein maps. Proteins are hard to pin down, differing in type and function by the organ or system in which they are found. They change makeup and location constantly. Each one might have as many as five different names; often, the names are long, to the point where searches of Google and other general databases frequently are useless. The mapping likely won't be finished in Pandey's lifetime, nor will it earn credit for him or his associates for isolating a new disease or finding a cure for an old one. Research grants are rare.
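A small example shows why a single curated index matters when one protein can carry several published names. The accession numbers and synonym lists below are invented for illustration; they are not records from the Human Protein Reference Database:

```python
# Invented records, for illustration only (not HPRD entries): map every
# published alias of a protein to a single canonical accession.
SYNONYMS = {
    "ACC_00001": {"tumor protein p53", "p53", "TP53"},
    "ACC_00002": {"epidermal growth factor receptor", "EGFR", "ERBB1", "HER1"},
}

# Build a reverse index once, so any alias resolves to its canonical record.
ALIAS_INDEX = {
    alias.lower(): accession
    for accession, aliases in SYNONYMS.items()
    for alias in aliases
}

def resolve(name):
    """Return the canonical accession for a known alias, or None."""
    return ALIAS_INDEX.get(name.lower())

print(resolve("HER1"))               # ACC_00002
print(resolve("Tumor Protein p53"))  # ACC_00001
print(resolve("unknown protein X"))  # None
```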

But by constructing a one-stop protein data shop for biomedical researchers, the institute can provide several valuable functions, Pandey says. It can help uncover biomarkers that often foretell the development of diseases, including cancer. It can also educate scientists as to when and where proteins are expressed, how they function and interrelate, and where they are located in the body.

The idea seems to have taken hold. The database and Web site receive 1 million hits per month from somewhere between 50,000 and 100,000 individual users — a strong sign that the institute fills a niche in the research world. Data on more than 20,000 human proteins have been collected on the site so far. "What we have is a bunch of loose parts, but they're necessary parts," Pandey says. Those parts come in from researchers around the world who have discovered bits of protein and their functions. The database gathers them all into one place. "You need people who are out doing the research to report their knowledge," he adds. Therein lies the rub. Researchers may have information on a protein that isn't particularly valuable to them, and so might not see the value in taking their time to post what they've learned about the protein's function and makeup. But if Pandey and his crew can find a way to persuade them, they might eventually be recalled as visionaries.

"The thing that drives me is the lateral thinking aspect of this," Pandey says. "We're trying to make a Google for proteins. Why aren't people trying to develop search engines for every aspect of life? Google has been around for 10 years, yet we really don't know what its ultimate potential is. People should just admit that: We're at the beginning of everything." — MA

