Curious Bends – sugarcane cultivation, Ebola in India, the cost of sanitation and more

Curious Bends is a weekly newsletter of science, tech and data news from around South Asia. Akshat Rathi and I curate it. It costs nothing to sign up. If you'd like a sample, check out the one below.

1. A brief history of sugarcane cultivation in India

“Even earlier than Nehru, Professor C.V. Raman saw the spark in her and made her a Foundation Fellow of the Academy. Years later, in 1957, she was elected to INSA — the first woman scientist elected to any of the science academies in India. She was also awarded the Padma Shri in 1957. Having led a full life, she breathed her last on February 4, 1984. Think about it; every time you bite a sugarcane, or a lump of gud or vellam, you are enjoying the fruits of toil of Barber, Venkataraman and Janaki.” (5 min read)

2. How would India handle an Ebola outbreak?

“That’s why in Nigeria’s largest city Lagos, where the majority of the country’s 20 cases were discovered, authorities urged people not to urinate or defecate in drains, dump sites and open spaces. The move is perhaps one reason why Nigeria has successfully contained the epidemic, with no new cases since Sept. 8. In India, around 600 million people defecate in the open. A lack of toilets and in some parts a cultural preference for going outdoors would make it almost impossible for similar public health advice to have the same effect.” (8 min read)

3. Five reasons why India’s first mental health policy is impressive

“Desai believes it is commendable that the policy goes beyond treatment of mental illness to prevention and promotion of mental health, but hopes that the Action Plan keeps Indian cultural contexts in mind while implementing policies for prevention and promotion. “While talking about policies for treatment of mental ailments, there is reasonable uniformity in approach,” said Desai. “But when it comes to personality development and seeking happiness, the Action Plan must keep cultural aspects in mind.”” (4 min read)

4. What is the true cost of sanitation?

“This is just a snapshot of what it will take to achieve sanitation that delivers on the promise of public health and personal dignity that we as a society seek. Are we prepared to bear this true cost? Let’s just take the Rs.12,000 subsidy the government has promised those who will construct toilets. There are 111.10 million households that would need toilets. That totals up to Rs.1.34 trillion for the toilet construction alone. It becomes easier to choose when we look at the true cost of not providing safe sanitation to all. A study by the Water and Sanitation Programme and others has estimated this at 6.4% of GDP of India in 2006. Not included in this is the cost of wasting the fertilizer and soil regeneration value of the human waste of a billion people.” (6 min read)

5. Featured longread: A new statistical law that’s popping up everywhere for unknown reasons

“Systems of many interacting components — be they species, integers or subatomic particles — kept producing the same statistical curve, which had become known as the Tracy-Widom distribution. This puzzling curve seemed to be the complex cousin of the familiar bell curve, or Gaussian distribution, which represents the natural variation of independent random variables like the heights of students in a classroom or their test scores. Like the Gaussian, the Tracy-Widom distribution exhibits “universality,” a mysterious phenomenon in which diverse microscopic effects give rise to the same collective behavior. “The surprise is it’s as universal as it is,” said Tracy, a professor at the University of California, Davis.”” (12 min read)

Chart of the week

“When the price of black gold falls, businesses and individuals cheer but oil-exporting countries suffer. According to research from Deutsche Bank, seven of the 12 members of OPEC, an oil cartel, fail to balance their budgets when prices are below $100. Last month Venezuela, a particularly inefficient member of the cartel, saw its bonds downgraded. One non-OPEC member in particular is in trouble: Russia. Economic growth is already poor. Further drops in the oil price could be very painful. After all, oil and gas make up 70% of Russia’s exports and half of the federal budget.” The Economist has the full story.


Curious Bends – Hudhud, fewer cyclone deaths, population control and more

Curious Bends is a weekly newsletter of science, tech and data news from around South Asia. Akshat Rathi and I curate it. It costs nothing to sign up. If you'd like a sample, check out the one below.

India went from 15,000 cyclone deaths in 1999 to just 38 last year

“The difference in the number of fatalities between Cyclone Phailin and the Uttarakhand cloudburst is instructive here. Both storms happened last year, yet Uttarakhand left more than 5,700 dead and millions affected. Although Phailin would also affect millions, its casualty count was kept to double digits. A big part of this was simply that there was no advance warning about the Uttarakhand cloudburst, while the Met department and local authorities had been tracking Phailin for weeks.” (4 min read, scroll.in)

How supercyclone Hudhud got its name

“For years cyclones that originated in the north Indian ocean were anonymous affairs. One of the reasons, according to Dr M Mahapatra, who heads India’s cyclone warning centre, was that in an “ethnically diverse region we needed to be very careful and neutral in picking up the names so that it did not hurt the sentiments of people”. But finally in 2004 the countries clubbed together and agreed on their favourite names.” (3 min read, bbc.co.uk)

Central government officials’ attendance record is now public. Thanks to Ram Sewak Sharma.

“The website is a near-complete digital dashboard of employee attendance—it logs the entry and exit time, among other things. The entire system is searchable, down to the names of individual central government employees, and all the data is available for download. And with that single step—making the entire platform publicly accessible—the government has introduced a level of accountability and transparency that India’s sprawling bureaucracy is unaccustomed to.” (5 min read, qz.com)

Indian women pay the price for population control

“23-year-old Pushpa, narrates a similar tale of pain. The nurse at a public health facility inserted her with an IUD after she delivered her first child. Her consent was not sought. The procedure was done after getting the consent form signed by her husband, a daily wage labourer who had studied up to Class V. He wasn’t explained what an IUD is and what the form was for.” (13 min read, tehelka.com)

After 67 years of independence, India gets a mental health policy

“Dr Harsh Vardhan pointed out that earlier laws governing the mentally ill, the Indian Lunatic Asylum Act, 1858, and Indian Lunacy Act, 1912, ignored the human rights aspect and were concerned only with custodial issues. After Independence it took 31 years for India to attempt the first enactment, which resulted another nine years later in the Mental Health Act, 1987. But due to many defects in this Act, it never came into force in any of the states and union territories.” (3 min read, pub.nic.in)

Chart of the week

“The survey of 44 countries, a quarter of them in Asia, shows that economic optimism has followed economic growth: eastward. The continent with the highest proportion of respondents believing their children will be better-off than they are is Asia, with 58%.” (2 min read, economist.com)


Earth, unlike Venus and Mars, exhaled nitrogen

One element that forms a unique part of the life-friendly chemical environment on Earth is the gas nitrogen. Its cyclic movement through the soil and the atmosphere via plants is a crucial part of how they produce energy.

Earth’s atmosphere is 78% nitrogen and 21% oxygen; the remainder includes carbon dioxide, methane and noble gases like argon and neon. In stark contrast, the ratio of nitrogen to argon by volume is almost ten times lower in the atmospheres of Venus and Mars.

There are competing explanations for why Earth’s atmosphere is rich in nitrogen, including physical similarities between the three rocky bodies, their distances from the Sun, etc. Now, a new study conducted by geologists from two American labs gives one of those explanations an upper hand. Unfortunately, it could also make the search for alien life harder.

According to them, Earth’s tectonic activity has allowed the planet to steadily exhale nitrogen from its interior into the atmosphere. Their findings were published in the journal Nature Geoscience on October 19.

Such activity has "added about 85% more nitrogen to Earth's atmosphere over the course of geological time," said Sami Mikhail, a geophysicist at the Carnegie Institution of Washington and lead author of the published paper.

The uppermost layer of Earth, on which we live, is not a continuous surface but a jigsaw of slowly moving plates called tectonic plates. They often grind into, slide over or under each other, forming mountains and deep trenches as the case may be. When one plate rises and another dives beneath it, the region where they meet is called a subduction zone. Often, a section of Earth’s mantle gets wedged between the two plates (see image).

How subduction zones could promote nitrogen degassing. Image: Sami Mikhail

Mikhail and Dimitri Sverjensky, of Johns Hopkins University, Maryland, together developed a model to understand how the mantle could pump out nitrogen into the atmosphere in a continuous process. They found that if ammonium sediments brought downward by the diving plate entered the mantle, they would react with oxygen to form nitrogen. While ammonium can get trapped in minerals, nitrogen can’t and escapes through vents in the tectonic plates and volcanoes into the atmosphere.

“Because subduction only happens on Earth, this has not happened on Mars and Venus. So the atmospheric composition of the three planets diverged once plate tectonics got going on Earth,” Mikhail said.

The duo also thinks the oxidation of ammonium in the mantle wedge would have led to the formation of more water, deposited on Earth's surface.

The search for alien life – whether within the Solar System or on faraway exoplanets – has taken many forms. Astronomers aren’t looking for a fixed set of conditions but some minimum requirements for life. On Earth, these have been the presence of liquid water, periodically changing seasons and the chemical environments necessary for the formation of macromolecules like DNA, among others.

A nitrogen-rich atmosphere is an important part of such an environment. If Mikhail’s conclusions are true, the search for alien life becomes trickier because Earth is the only known planet with subduction zones.

“Maybe life would have survived for billions of years without subduction zones, but without subduction zones the atmosphere would be drastically different and therefore so would life,” Mikhail speculated.

In September 2014, another study published in Nature Geoscience found that Jupiter’s moon Europa also harbored tectonic activity. The announcement raised scientists’ hopes of finding life because Europa also has a subsurface ocean of liquid water. However, Europa’s surface is nothing like Earth’s, and it’s hard to say if subduction can do for Europa what it did for Earth.

Measuring loudness on Deepavali nights

The Indian festival of Deepavali gets its name from the Sanskrit for "display of lights", "Deepaanaam aavali". These days, the festival is anything but about lights, especially in urban centers where the bursting of loud firecrackers has replaced the gentler display of lamps. Sometimes, Bangalore – where I live – sounds like a warzone. People I've spoken to have defended the way they celebrate it, saying, "It's a tradition thousands of years old!" No, it's not.

The Noise Pollution (Regulation and Control) Rules 2000 ("Rules") limit the amount of environmental noise permitted by area and time of day. In residential areas, for example, the maximum allowable noise level between 6 am and 10 pm ('daytime') is 55 dB(A) Leq.

A Central Pollution Control Board document released on October 24 reports the results of an exercise in which noise-level monitors listened in at five areas of the national capital, Delhi, for the week leading up to Deepavali: October 15 to October 23. Without exception, the dB(A) Leq readings in all five areas – Pragati Maidan, East Arjun Nagar, NSIT Dwarka, IHBAS Dilshad Garden and DCE Bawana – increased from 2013 to 2014. The nighttime readings breach the Rules' limits by at least 10 dB(A), which warrants a complaint.

Insofar as the Rules are concerned, the units of measurement play a defining role in how meaningful the limits are.

For starters, dB stands for decibels, a logarithmic measure of noise levels; a doubling of sound energy corresponds to an increase of about 3 dB.

Because noise levels keep changing during many kinds of measurements – including during Deepavali – the Leq is used: it denotes the equivalent average noise level over a specified period. And because dB is a logarithmic measure, an Leq cannot be calculated as a simple average of dB readings. Instead, sound meters convert each reading into the corresponding sound energy, average those values, and convert the result back into dB. In the process, the A-weighting is also applied: it is a scale that approximates perceived human loudness.
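
To make that averaging concrete, here is a minimal sketch in Python (my own illustration; neither the Rules nor the NoiseTube app prescribe this code, and the sample values are made up) of how per-second dB(A) readings become an Leq, and of why a few loud bursts dominate the result:

```python
import math

def leq(db_samples):
    """Energy-average a list of dB(A) readings into a single Leq value."""
    # Convert each reading to relative sound energy, average, convert back to dB.
    energies = [10 ** (level / 10) for level in db_samples]
    return 10 * math.log10(sum(energies) / len(energies))

# A mostly quiet 60-second stretch punctuated by two firecracker bursts.
readings = [42] * 58 + [85, 90]
print(round(leq(readings), 1))       # ~73.4 dB(A) Leq
print(round(sum(readings) / 60, 1))  # ~43.5, the (misleading) plain average of the dB values
```

Two loud seconds are enough to pull the Leq about 30 dB above the plain average of the readings, which is why bursts of firecrackers can dominate a period average even when most of that period is quiet.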

As it is, the Rules don’t explicitly specify the time period across which the noise levels are to be measured. The only mention of periods, in fact, is when the document defines daytime (6 am to 10 pm) and nighttime (10 pm to 6 am). So the noise level of 55 dB(A) Leq is presumably defined for a 16-hour period (daytime; residential area). An obvious outcome of this is that infrequent loud noises in a generally quiet residential area will not breach the legal limits during daytime.

But what about during Deepavali? Let’s say the festival is being celebrated on a weekday: the bursting of firecrackers will start around 4 pm (once the kids have returned from school) and last until 9 pm. Could noise levels in this five-hour period push the daytime average beyond 55 dB(A) Leq?

I used the NoiseTube project’s mobile app (of the same name) that makes per-second measurements and calculates the minimum, maximum and average dB(A) Leq over a specified duration. Sitting about 80 m from a bunch of kids bursting firecrackers in our apartment driveway, I used the app to make 300 measurements over 5 minutes for the following results:

Min.: 41.51 dB(A) Leq
Max.: 83.88 dB(A) Leq
Avg.: 66.41 dB(A) Leq

Earlier in the day, I'd made a five-minute measurement when no firecrackers were being burst, for an average reading of 42 dB(A) Leq. So, assuming that 42 dB(A) Leq was the reading for 11 hours and 66.41 dB(A) Leq for the remaining five, the daytime average reading comes to 56.53 dB(A) Leq. Going by the Rules, this isn't even enough for me to register a complaint, which requires the noise levels to exceed the limit by at least 10 dB(A).

At the same time, the noise levels are debilitating. When the cracker-bursting frenzy is in full swing, I’ve recorded noises louder than 100 dB(A). If I spend a day outside, and if the sulfurous fumes don’t give me a migraine, just the noise will.

Because of this, the Rules might be more meaningful – and effective – if a measurement duration were defined, for instance between certain times of day depending on the time of year (correct me if I'm wrong, because I'd love to be wrong about this). In fact, because dB is logarithmic, any average will be biased toward the higher readings – the louder the noise, the more it dominates the average – and even with this boost, 66.41 dB(A) Leq over five hours is not 'illegal enough'.

Featured image from Wikimedia Commons

India's OA policy: Learning from Ioannidis

India's first Open Access policy was drafted by a committee affiliated with the Departments of Biotechnology and Science & Technology (DBT/DST) in early 2014. It hasn't been implemented yet. The first draft accepted comments on its form and function on the DBT website until July 25; the second draft was released last week and is open for comments until November 17, 2014. If it comes into effect, it could greatly expand the prevalence of healthy research practices in the Indian scientific community, at a time when much of the rest of the world is too encumbered by scale and complexity to mandate such practices.

The policy aspires to set up a national Open Access repository, akin to PubMed for biomedical sciences and arXiv for physical sciences in the West, that will maintain copies of all research funded in part or in full by DBT/DST grants. And in the spirit of Open Access publishing, its contents will be fully accessible free of charge.

According to the policy, if a scientist applies for a grant, he/she must provide proof that previous research conducted with grants has been uploaded to the repository, and the respective grant IDs must be mentioned in the uploads. Moreover, the policy also requires institutions to set up their own institutional repositories, and asks that the contents of all institutional repositories be interoperable.

The benefits of such a repository would be many and great. It would solve a host of problems that are becoming more intricately interconnected, giving rise to a veritable Gordian knot of stakeholder dynamics. India's relatively small research community can avoid this knot by implementing a few measures, including the policy.

For one, calls for restructuring the Indian academic hierarchy have already been made. Here, even university faculty appointments are not transparent. The promotion of scientists with mediocre research outputs to top administrative positions stifles better leaders who have gone unnoticed, and their protracted tenure at the helm often stifles new initiatives. As a result, much of scientific research has become the handmaiden of defence research, if not of profitability. In the biomedical sector, for example, stakeholders desire reproducible results to determine profitable drug targets but are loath to share data from subsequent stages of the product-development cycle because of their investments.

There is also a bottleneck between laboratory prototyping and mass production in the physical sciences because private sector participation has been held at bay by concordats between Indian ministries. In fact, a DST report from 2013 concedes that the government would like to achieve 50-50 investment from private and public sectors only by 2017, while the global norm is already 66-34 in favour of private.

In fact, these concerns have been repeatedly raised by John Ioannidis, the epidemiologist whose landmark 2005 paper on the unreliability of most published medical findings set off a wave of concern about the efficiency of scientific research worldwide. It criticized scientists for favouring positive, impactful results – even where none could exist – in order to secure funding. In doing so, however, they skewed the medical literature to paint a more revolutionary picture than prevails in real life, and wasted an estimated 85% of research resources in the process.

Ioannidis’s paper was provocative not because it proclaimed the uselessness of a lot of medical results but because it exposed the various mechanisms through which researchers could persuade the scientific method to yield more favourable ones.

His 'sequel' paper was published on October 19, on the 10th anniversary of the Open Access journal PLOS Medicine. In it, he goes beyond specific problems – such as small sample sizes, reliance on outdated statistical measures and flexibility in research design – to showcase what disorganized research can do to undermine itself. The narrative will help scientists and administrators alike design more efficient research methods, and so help catalyse the broad-scale adoption of practices that have until now been viewed as desirable only for this or that research area. For India, implementing its Open Access policy could be the first step in this direction.

Making published results – those funded in part or fully by DBT/DST grants – freely accessible has been known to engender practices like post-publication peer-review and data-sharing. Peer-review is the process of getting a paper vetted by a group of experts before publication in a journal. Doing that post-publication invites constructive criticism from a wider group of researchers and exposes the experimental procedures and statistical analyses to scrutiny. This in turn inculcates a culture of replication – where researchers repeat others' experiments to see if they can reach the same conclusions – that reduces the prevalence of bias and makes scientific research as a whole more efficient.

Furthermore, requiring multiple institutional repositories to be interoperable will spur the development of standardised definitions and data-sharing protocols. It will also lend itself to effective data-mining for purposes of scientometrics and science communication. In fact, the text and metadata harvester described in the policy is already operational.
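
The draft (at least the portions quoted here) doesn't specify which protocol its harvester speaks, but institutional repositories are most commonly harvested over OAI-PMH. Below is a minimal, hypothetical Python sketch of what mining titles from one of the sciencecentral.in sub-domains could look like under that assumption; the "xyz" sub-domain is the draft's own placeholder and the "/oai" path is my guess, so treat both as illustrative only.

```python
import urllib.request
import xml.etree.ElementTree as ET

# Hypothetical endpoint: "xyz" is the draft's placeholder sub-domain and
# "/oai" is an assumed path; the real harvester may work differently.
ENDPOINT = "https://xyz.sciencecentral.in/oai"

OAI = "{http://www.openarchives.org/OAI/2.0/}"
DC = "{http://purl.org/dc/elements/1.1/}"

def list_titles(endpoint=ENDPOINT):
    """Fetch one page of OAI-PMH records and yield (identifier, title) pairs."""
    url = endpoint + "?verb=ListRecords&metadataPrefix=oai_dc"
    with urllib.request.urlopen(url) as response:
        tree = ET.parse(response)
    for record in tree.iter(OAI + "record"):
        identifier = record.findtext(".//" + OAI + "identifier")
        title = record.findtext(".//" + DC + "title")
        yield identifier, title

if __name__ == "__main__":
    for identifier, title in list_titles():
        print(identifier, ":", title)
```

Because OAI-PMH is a standard, the same few lines would work against any compliant institutional repository, which is the practical payoff of interoperability for scientometrics and science communication.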

Registration of experiments – the practice of formally notifying an authority that you're going to perform an experiment – is also a happy side-effect of having a national Open Access repository, because it makes the use of public funds more tractable, something Ioannidis emphasizes. By declaring their sources of funding, scientists automatically register their experiments. This could bring as-yet invisible null and negative results to the surface.

A Stanford University research team reported in August 2014 that almost 67% of experiments (funded by the National Science Foundation, USA) that yielded null results never see the light of day, while only 21% of those sent to journals are published. By contrast, 96% of papers with strong, positive results are read and 62% are published. Without prior registration of experiments, then, the record of how public funds are used for research can be distorted – detrimental to a country that actually requires more oversight.

It is definitely foolish to assume one policy can be a panacea. Ioannidis's proposed interventions cover a range of problems in research practices, and they are all difficult to implement at once – even though they ought to be. But it would be more foolish to have in hand a part of the solution – one capable of reforming the evaluation system in ways that benefit the credibility of scientific research – and delay its implementation. Even if the Open Access policy can't acknowledge institutional nepotism or the hypocrisy of data-sharing in biomedical research, it provides an integrated mechanism to deal with the rest. It helps adopt common definitions and standards; promotes data-sharing and creates incentives for it; and emphasizes the delivery of reproducible results.

Second draft of India's OA policy open for comments

The second draft of India's first Open Access policy is up on the Department of Biotechnology (DBT) website. Until November 17, 2014, DBT Adviser Mr. Madhan Mohan will receive comments on the policy's form and function, after which a course for implementation will be charted. The Bangalore-based Centre for Internet and Society (CIS), a non-profit research unit, announced the update on its website while also highlighting some instructive differences between the first and second drafts of the policy.

The updated policy makes it clear that it isn’t concerned about tackling the academic community’s prevalent yet questionable reliance on quantitative metrics like impact-factors for evaluating scientists’ performance. Prof. Subbiah Arunachalam, one of the members of the committee that drafted the policy, had already said as much in August this year to this blogger.

The draft also says that it will not “underwrite article-processing charges” that some publishers charge to make articles available Open Access. The Elsevier Publishing group, which publishes 25 journals in India, has asked for a clarification on this.

Adhering to the policy's mandates means scientists who have published a paper made possible, in part or in full, by funding from the Departments of Biotechnology and Science & Technology should deposit that paper in an Open Access repository maintained either by the government or by the institution they're affiliated with.

They must do so within two weeks of the paper being accepted for publication. If the publisher has instituted an embargo period, then the paper will be made available on the repository after the embargo lifts. CIS, which advised the committee, has recommended that this period not exceed one year.

As of now, according to the draft, “Papers resulting from funds received from the fiscal year 2012-13 onwards are required to be deposited.” A footnote in the draft says that papers under embargo can still be viewed by individuals if the papers’ authors permit it.

The DBT repository is available here, and the DST repository here. All institutional repositories will be available as sub-domains on sciencecentral.in (e.g., xyz.sciencecentral.in), while the domain itself will lead to the text and metadata harvester.

The drafting committee also intends to inculcate a healthier Open Access culture in the country. It writes in the draft that “Every year each DBT and DST institute will celebrate “Open Access Day” during the International Open Access Week by organizing sensitizing lectures, programmes, workshops and taking new OA initiatives.”

'When you change something in a virus, you lose something else'

The contents of this blog post should have come out earlier (in a different form) but better late than never, eh? The Ebola outbreak has been threatening more than ever to spin out of control (even whether we're really in control now is doubtful). As doctors and healthcare workers grappled with containment in West Africa, Michael Osterholm, the director of the Center for Infectious Disease Research and Policy, University of Minnesota, wrote an alarmist opinion piece in The New York Times on September 11 that was more panic-mongering than instigatory. The thrust of Osterholm's argument was:

The second possibility is one that virologists are loath to discuss openly but are definitely considering in private: that an Ebola virus could mutate to become transmissible through the air. … If certain mutations occurred, it would mean that just breathing would put one at risk of contracting Ebola. Infections could spread quickly to every part of the globe, as the H1N1 influenza virus did in 2009, after its birth in Mexico.

Sometime soon after, I spoke to a virologist at Columbia University, Dr. Vincent Racaniello, about Osterholm’s statements. I picked out Dr. Racaniello after stumbling on his virology blog (bookmark it, it’s very insightful) which at the time appeared to be one of the few voices of reason advocating caution in the face of the outbreak and pushing against the notion of an airborne Ebola virus with some crucial facts. Below, I reproduce parts of our conversation that address the nature of such facts and how they should guide us.

Note: For the TL;DR version, scroll right to the bottom.

What we know about Ebola based on what we’ve learnt from studying viruses

Some viruses are studied more than others because of their impact on human health. HIV, influenza, the herpes viruses… Herpes viruses infect almost every person on the Earth; influenza infects hundreds of thousands every year; HIV has infected millions and millions of people – so those get most of the attention, so people work on them a lot. Some of the things you find may be generalizable, such as the general need of a virus to get inside of a cell, replicate its genome. But each virus has specifics. Each is very different, the genome is different, the way the genome is encased is different, the way it gets into cells is different, and the ways they spread from person to person are often very different.

For example, if you study transmission of the influenza virus in an animal model, you may learn what controls the transmission of those viruses through the air, but you can’t assume that’s going to be the same for the Ebola virus. So people make the mistake of saying “Because this virus does this, then that virus must do the same thing”. That’s not correct. Unfortunately, it makes it complicated because every virus needs to be studied on its own. We can’t study influenza and hope to prevent Ebola.

How viruses evolve to become deadlier

From what we have seen, if you gain a function, you typically lose something else. When humans impose genetic changes on viruses, they’re doing so from their point of view as opposed to the way it happens in nature, where evolution does the job. When a virus in nature somehow evolves and becomes transmissible in some species, it’s because the virus with the right genome has been selected as opposed to in the lab where a human puts one or two mutations in a virus and gets a phenotype. We don’t know how to achieve gain-of-function in viruses in the lab. We have a lot of hubris, we think we can do anything with viruses. We introduce an amino acid change but who knows what it’s doing to the virus.

What we've observed over the years is that when you introduce changes in the virus in the laboratory to get a new property that you want, you lose something else. In terms of transmission, there haven't been that many transmission experiments done with viruses to understand what controls transmission. H5N1 – avian influenza – in ferrets is really the only one, and there, the gain of aerosol transmission caused the loss of virulence. It's probably because you need other changes to compensate for what you've done but we're only looking at transmission.

In nature, perhaps that would be taken care of, so that’s why I say when you change something in a virus you lose something else. But this is not to say that this is always going to be the case. You can’t predict in viruses – you can’t predict in science, often – what’s going to happen. But what we can do is use what we know and use that to inform our thinking. For example, in nature, influenza viruses are very nicely transmitted, but they’re not all that virulent. They don’t have a 90% case-fatality ratio like Ebola, so I think there’s something there that tells us that aerosol transmission is a difficult thing to achieve. But we don’t know what will happen.

An Ebola virus virion. Image: CDC/Wikimedia Commons

About what other evolutionary pathways Ebola has at its disposal

Viruses can be transmitted in a number of ways. They can be transmitted through the air, they can be transmitted by close contact of various sorts, they can be transmitted by body fluids, they can be transmitted by sexual contact, intravenous drug use, mother to child during birth, they can be transmitted by insect vectors, and of course some can be transmitted in our DNA – 8% of our genome is a virus. We have never seen a human virus change the way it’s transmitted. Once a virus has already been in people, we have never seen it change.

We’ve been studying viruses for just over a 100 years which is admittedly not a long time – viruses have probably been around since the beginning of the Earth, billions of years – but we go based on what we know, and we’ve never seen a virus change it’s mode of transmission. I’m not particularly worried about Ebola changing its routes of transmission. Right now, it’s spreading by close contact from person to person via body fluids and I think it’s going to stay that way. I don’t think we need to worry about it being picked up by a mosquito for example – that’s very difficult to do because then the virus would have to replicate in the mosquito and that’s a big challenge. And who knows, if it acquired that, what other property would be compromised.

What, according to Dr. Racaniello, we need to focus on

I think we need to really bear down on stopping transmission. It can be done, it’s not going to be easy, but it’s going to require other countries helping out because these West African countries can’t do it themselves. They don’t have a lot of resources and they’re losing a lot of their healthcare people from the epidemic itself. I don’t see what worrying about aerosol transmission would do. I don’t see it changing the way we treat the outbreak at all. I think right now we need to get vaccines and antivirals approved, so that we can get in there and use them. In the meantime, we need to try and interrupt transmission. In past outbreaks, interrupting it has been the way to stop the outbreaks. Admittedly, they’ve been a lot smaller, easier to contain. But SARS infected 10,000 people globally and it was contained by very stringent measures. That was a virus that did transmit by aerosol. So it can be done – it’s just a matter of getting everyone cooperating to do it.

If a virus can become more transmissible after infecting a human population

If you saw the movie 'Contagion' – in this movie, the virus mutated and increased its reproductive index, which I thought was one of the weaknesses of the movie. We've never seen that happen in nature, which is not to say that it hasn't. When a virus starts circulating in people, it has everything it needs to circulate effectively. Often, people will bring up the 1918 influenza virus, which seemed to get more virulent as the outbreak continued, but back then we hadn't even isolated the influenza virus. It wasn't isolated until 1933. So there's just no way we can make definitive statements about what did or didn't happen, but people speculate all the time.

I wish we could go back in time and sample all the viruses that have been out there but we’re going to have to see it happen. For that same reason, no virus has ever changed its transmission route in people. If it had, we could have taken the virus before and after the change and sequence it and say, “Aha! This is what’s important for this kind of transmission!” We don’t have that information so we depend on animals for this.

TL;DR:

  • We can’t study influenza and hope to prevent Ebola.
  • When you introduce changes in the virus in the laboratory to get a new property that you want, you lose something else.
  • In nature, influenza viruses are very nicely transmitted, but they’re not all that virulent. I think there’s something there that tells us that aerosol transmission is a difficult thing to achieve.
  • No virus has ever changed its transmission route in people.
  • SARS infected 10,000 people globally and it was contained by very stringent measures. That was a virus that did transmit by aerosol. So it can be done – it’s just a matter of getting everyone cooperating to do it.

Europa’s ice shell could be quaking

Even before astronomers noticed last year that Europa was spouting jets of water vapor from its icy surface, they thought there was something shifty about Jupiter’s moon. While the 66 other Jovian moons are pitted with craters, Europa sports some unusual blemishes: an abundant crisscrossing of ridges tens of kilometres long. Many are abruptly interrupted by smooth ice patches.

Two geologists think they can explain why. Backed by photos taken by the Galileo space probe, they suggest Europa’s thick shell isn’t continuous but is made up of distinct plates of ice. These plates move away from each other in some places, exposing gaps which are then filled by deeper ice rising upward. In other places they slide over each other and push surface ice downward and form ridges.

“We knew that stuff has been moving over the surface, and up from beneath and breaking through, but we weren’t able to figure where all the older stuff was going,” said study coauthor Dr. Louise Prockter, a planetary scientist at Johns Hopkins. “We’ve found for the first time evidence that material is going back into the interior.” The study was published last month in Nature Geoscience.

On Earth, this kind of tectonic activity replenishes compounds necessary for life, such as carbon dioxide, by letting them move up from the interior through fissures to the surface. Now, scientists say a similar mechanism could apply to Europa. Astronomers think the moon harbors a subsurface ocean of liquid water that feeds the vapor plumes, and could be habitable.

“It’s certainly significant to find another solid body in the solar system that undergoes some kind of surface recycling,” said Peter Driscoll, a planetary scientist at the University of Washington who was not involved in the study.

Prockter, together with Simon Kattenhorn, a geologist at the University of Idaho, Moscow, worked with photographs of a part of Europa’s surface covering 20,000 km2. The pictures were shot by Galileo when it orbited Jupiter from 1995 to 2003.

“We go in using something like Photoshop and start cutting the image up,” Dr. Prockter explained. They then pieced them back together so that the crisscrossing ridges lined up end-to-end, and compared what they had to the surface as it is today.

“Once we started doing the reconstruction, we ended up with a big gap right in the middle,” she said.

The researchers concluded the missing bit had dived beneath another plate.

Although only some of Galileo’s photographs were at a resolution high enough to be useful for the study, Dr. Prockter said it was unlikely that their finding was a one-off because signs of displacement were visible all over Europa’s surface.

Nevertheless, Dr. Driscoll cautioned against using Earth’s tectonic activity as a model for Europa’s. “There are a number of missing features” that define tectonics on Earth, he said, such as arc volcanos and continents. “And many of the properties of Earth’s features may not be expected for an icy shell like Europa, where the materials are extremely different.”

A better gauge of these disparities might be a probe to the Jovian moon that NASA has planned for the mid-2020s.

“I think the timing right now is very important,” said Candice Hansen, a member of NASA’s Planetary Science Subcommittee. She says the Europa study will help scientists working on the probe secure the requisite funding and commitment from Congress.

“I am very enthusiastic about a mission to Europa, and this exciting result is one more reason to go,” she said.

Artist's concept of the Europa Clipper mission investigating Jupiter's icy moon Europa. Image credit: NASA/JPL-Caltech

Why you should care about the mass of the top quark

In a paper published in Physical Review Letters on July 17, 2014, a team of American researchers reported the most precisely measured value yet of the mass of the top quark, the heaviest fundamental particle. Its mass is so high that it can exist only in very high-energy environments – such as inside powerful particle colliders or in the very early universe – and nowhere else.

Given this, the American team's efforts to measure its mass might come across as needlessly painstaking. However, there's an important reason to get as close to the exact value as possible.

That reason is possibly the most famous discovery of 2012. It was drinks all round for the particle physics community when the Higgs boson was discovered by the ATLAS and CMS experiments on the Large Hadron Collider (LHC). While the elation lasted a while, serious questions were already being asked about some of the boson's properties. For one, it was much lighter than anticipated by some promising areas of theoretical particle physics. Proponents of an idea called naturalness pegged it to be 19 orders of magnitude higher!

Because the Higgs boson is the particulate residue of an omnipresent energy field called the Higgs field, the boson's mass has implications for how the universe should be. With the boson so much lighter than expected, physicists couldn't explain why the universe wasn't the size of a football, which is what their calculations implied it should be.

In the second week of September 2014, Stephen Hawking said the Higgs boson will cause the end of the universe as we know it. Because it was Hawking who said it, and because his statement contained the clause "end of the universe", the media hype was ridiculous yet to be expected. What he actually meant was that the 'unnatural' Higgs mass had placed the universe in a difficult position.

The universe would ideally love to be in its lowest energy state, like you do when you’ve just collapsed into a beanbag with beer, popcorn and Netflix. However, the mass of the Higgs has trapped it on a chair instead. While the universe would still like to be in the lower-energy beanbag, it’s reluctant to get up from the higher-energy yet still comfortable chair.

Someday, according to Hawking, the universe might gain enough energy (get out of the chair) and then collapse into its lowest energy state (the beanbag). And that day is trillions of years away.

What does the mass of the top quark have to do with all this? Quite a bit, it turns out. Fundamental particles like the top quark possess their mass in the form of potential energy. They acquire this energy when they move through the Higgs field, which is spread throughout the universe. Some particles acquire more energy than others. How much energy is acquired depends on two parameters: the strength of the Higgs field (which is constant), and the particle’s Higgs charge.

The Higgs charge determines how strongly a particle engages with the Higgs field. It’s the highest for the top quark, which is why it’s also the heaviest fundamental particle. More relevant for our discussion, this unique connection between the top quark and the Higgs boson is also what makes the top quark an important focus of studies.
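
The article doesn't spell this relationship out, but in the Standard Model it takes a simple form: a particle's mass is its 'Higgs charge' (the Yukawa coupling y) times the constant strength of the Higgs field (the vacuum expectation value v), divided by the square root of two:

```latex
m_f = \frac{y_f \, v}{\sqrt{2}}, \qquad v \approx 246\ \text{GeV}
```

For the top quark the coupling is close to 1, giving roughly 246/√2 ≈ 174 GeV/c², essentially the measured value; every other fundamental fermion has a much smaller coupling and hence a much smaller mass.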

Getting the mass of the top quark just right is important for better determining its Higgs charge, ergo the extent of its coupling with the Higgs boson, ergo the properties of the Higgs boson itself. Small deviations in the value of the top quark's mass could spell drastic changes in when or how our universe will switch from the chair to the beanbag.

If it does, all our natural laws would change. Life would become impossible.

The American team that made the measurements used values obtained from the D0 experiment on the Tevatron particle collider at the Fermi National Accelerator Laboratory. The Tevatron was shut down in 2011, so these measurements are the collider's last word on the top quark's mass: 174.98 ± 0.76 GeV/c2 (the Higgs boson weighs around 126 GeV/c2; a gold atom, considered pretty heavy, weighs around 183 GeV/c2). This is a precision of better than 0.5%, the finest yet. The value is likely to be updated once the LHC restarts early next year.
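
As a quick back-of-the-envelope check on that precision figure (the arithmetic is mine, not the paper's):

```latex
\frac{\delta m_t}{m_t} = \frac{0.76}{174.98} \approx 0.0043 = 0.43\% < 0.5\%
```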

Featured image: Screenshot from Inception

A standout technology prize

The Nobel Prize award ceremony, Stockholm, 2007. Image: nobelprize.org

Once a year, the Nobel Prize in physics triggers a burst of science news coverage in the media, giving some decades-old invention or discovery more than its 15 minutes’ due on a channel, paper or portal that might have otherwise never bothered about it. Despite its abundant quirks, the prize, the consequent celebration and the subsequent snubs do make for good news.

But this year’s prize may have been a little different. It was awarded to the inventors of the blue-light-emitting diodes (blue LEDs). LEDs that emit the two other primary colors, green and red, were easier to produce. The higher frequency blue emitter proved to be the stumbling block before this year’s Japanese and American Laureates succeeded in the late 1980s. By combining the three colors, the white LED emerged and became the device to, as the Nobel Prize Committee is only too happy to proclaim, power the 21st century.

The reason it's different is that it draws attention to an arguably understated engineering development. The Nobel Prize Committee has not had any perceptible bias toward or against engineering, specifically materials science: the 2000, 2001, 2003, 2007, 2009 and 2010 physics prizes, to choose from the last decade, lauded accomplishments in engineering and materials science. However, unlike this year's recipient, those accomplishments had become very popular and entered mainstream public consciousness by the time their significance was recognized with a Nobel Prize. In fact, that has been the pattern for most prize-winning discoveries: scientifically significant as well as novelty heavyweights.

In contrast, this year's prize was more for the achievement of synthesizing gallium nitride (GaN), the compound semiconductor at the heart of blue LEDs, whose success story hasn't quite been one for the romantic science books. It could be that the blue LED – or LEDs in general – didn't need romanticization, that the pursuit of it had already been justified to the common man by giving him a cheap, "energy-saving" light bulb. It could be that its technology was so sought-after that it was only too successful in transcending the boundary between discovery and mass utilization.

Since it was first awarded in 1901, the Nobel Prize in physics has recognized 114 discoveries (75%) but only 39 inventions (25%, including the blue LED)*. On the other hand, this century has seen a higher incidence of inventors among Laureates, as well as a tendency to recognize more and more recent inventions. The blue LED (2014) emerged in the late 1980s; Wineland's and Haroche's particle-manipulation techniques (2012) were developed in the late 1980s; graphene (2010) was first produced in 2003; the CCD sensor (2009) was invented in 1969; the frequency comb (2005) was perfected in the 1990s; the achievement of Bose-Einstein condensates (2001) came in 1995; and so forth.

This may well be the Nobel Prize in physics Committee’s way of acknowledging the dominance of technology. It could also be our window to understanding how award-winning science of the previous century is shaping the award-winning technology of the last three decades.


*Determined based on Nobel Prize citations. For Laureates whose citations were ambiguous, such as “contributed to the development of”, etc., the nature of work was assumed to be both an invention as well as a discovery.