Has 'false balance' become self-evidently wrong?

Featured image credit: mistermoss/Flickr, CC BY 2.0.

Journalism’s engagement with a convergent body of knowledge is interesting in two ways. From the point of view of the body of knowledge, journalism is typically seen as an enabler, an instrument for furthering its goals, and remains adjacent at best until it begins to have an adverse effect on the dominant forces of convergence. From the point of view of journalism, the body of knowledge isn’t adjacent but visceral – the flesh with which the narratives of journalistic expression manifest themselves. Both perspectives are borne out in the interaction between anthropogenic global warming (AGW) and its presence in the news. For journalism especially, covering AGW has been something of a slow burn because the assembly of its facts can’t be catalysed even as it maintains a high propensity to be derailed, requiring journalists to sustain a constant intensity over a longer span of time than would typically be accorded to other news items.

When I call AGW a convergent body of knowledge, I mean that it is trying to achieve consensus on some hypotheses – and the moment that consensus is achieved will be the point of convergence. The latest report from the Intergovernmental Panel on Climate Change says it is at least 95% certain that the ongoing spate of global warming is a result of human activities – a level of certainty that we’ll take to be just past the point of convergence. Until this point, the coverage of AGW was straightforward: there were two sides that deserved to be represented equally. When the convergence eliminated one side, it was a technical elimination – a group of fact-seekers getting together and agreeing that what they had on their hands was indeed a fact even if they weren’t 100% certain.

What this meant for journalism was that its traditional mode of creating balance was no longer valid. The principal narrative had shifted from being a conflict between AGW-adherents and AGW-deniers (“yes/no”) to a conflict between some AGW-adherents and other AGW-adherents (“less/more”). And if we’re moving in the right direction, less/more is naturally the more important conflict to talk about. But post-convergence, any story that reverted to the yes/no conflict was accused of having succumbed to false balance, and calling out instances of false balance has since become a thing. Now, to the point of my piece: have we finally entered a period wherein calling out instances of false balance has become redundant – wherein awareness of the fallacies of AGW-denial has matured enough that false balance is either deliberate or the result of mindlessness?

Yes, I think so – false balance has finally become self-evidently wrong, and to not acknowledge this is to concede that AGW-denial might still retain some vestiges of potency.

I was prompted to write this post after I received a pitch for an article to be published on The Wire, about using the conclusions of a recently published report to ascertain that AGW-denial was flawed. In other words: new data, old conclusions. And the pitch gave me the impression that the author may have been taking the threat of AGW-deniers too seriously. Had you been the editor reading this, would you have okayed the piece?

We’ve become more ambitious about reaching Alpha Centauri – what changed?

Yuri Milner’s announcement last night that he’s investing $100 million into figuring out how thousands of chip-sized probes could be sent to the Alpha Centauri star system in 20 years must’ve felt like the future to many. The proposal, titled Starshot, imagines the probes to be fitted with small ‘sails’ a few hundred atoms thick that could be propelled to 60,000 km/s by a powerful array of lasers fired from Earth. And once they get to the Alpha Centauri stars A or B, they could take images with a 2 MP camera and transmit them to Earth through an optical communications channel. The radical R&D developed on the way to achieving these big goals could also be used to visit planets within the Solar System in a matter of hours to days, as well as to study asteroids and stars with the lasers and their optical systems.

But strip away the radicalness and Starshot begins to resemble pieces of the previous century as well – pieces that make for a tradition in which Milner is only the latest, albeit most prominent, participant, and which provide an expanded frame of reference to examine what had to change for astronomers to dream of literally reaching for the stars. The three most prominent pieces are Orion, Daedalus and Daedalus’s derivative, Longshot. All three relied on technology that didn’t exist but soon would. Daedalus and Longshot in particular wanted to send unmanned probes to nearby stars within 100 years. And even more specifically, Longshot is closest to Milner’s idea in its aspiration to:

  • Launch the probes from space, not from the ground
  • Envision and build a more efficient kind of propulsion, and
  • Send unmanned probes to Alpha Centauri

Commissioned by the British Interplanetary Society and led by an engineer named Alan Bond, Daedalus involved designing an unmanned probe that could be sent to Barnard’s Star, 5.9 lightyears away, in 50 years using contemporary technological ideas. At the time, in 1973-1978, the most efficient such idea was nuclear fusion – it was only two decades earlier that another nuclear-pulse-propelled rocket, Project Orion, had been under consideration by the physicists Theodore B. Taylor and Freeman Dyson. Because the amount of power and thrust produced by fusion was known to be very high, Bond’s Daedalus could be massive as well.

The size of the (unbuilt) Daedalus starship compared to the Empire State Building, which is 443 metres tall. Credit: Adrian Mann/bisbos.com

Conventional rockets that burn high-grade fuel in large quantities in a short span of time to get off the ground are limited by the pesky Tsiolkovsky rocket equation. At its simplest, the equation describes how a rocket that wants to carry more payload must also carry more fuel to carry that payload, in turn becoming even heavier, in turn having to carry even more fuel, and so forth. On the other hand, a small nuclear power plant carried onboard Daedalus’s rocket would seem to provide quick escape from the Tsiolkovsky problem – compressing small amounts of helium-3 (mined from the moon or the clouds of Jupiter by hot-air balloons) using electron beams to yield 250 pellet-detonations per second, sustained for 3.8 years. So the project’s study concluded that Daedalus’s rocket could weigh 54,000 tonnes (including fuel).
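Tsiolkovsky’s tyranny is easy to see numerically. A minimal sketch, with illustrative numbers of my own choosing (a 1,000-kg dry craft and a chemical-engine exhaust velocity of about 4.5 km/s), shows how the propellant needed grows exponentially with the velocity change you want:

```python
import math

def propellant_mass(dry_mass_kg, delta_v_ms, exhaust_velocity_ms):
    """Tsiolkovsky rocket equation: delta_v = v_e * ln(m0 / m_f).
    Solve for the initial mass m0, then subtract the dry mass
    to get the propellant required."""
    m0 = dry_mass_kg * math.exp(delta_v_ms / exhaust_velocity_ms)
    return m0 - dry_mass_kg

# Illustrative: a 1,000-kg dry craft with a chemical engine (v_e ~ 4.5 km/s)
for dv_kms in (5, 10, 20, 40):
    fuel = propellant_mass(1000, dv_kms * 1000, 4500)
    print(f"delta-v {dv_kms:>2} km/s -> {fuel:,.0f} kg of propellant")
```

Doubling the target velocity far more than doubles the fuel bill – which is why chemical rockets cannot even dream of interstellar speeds, and why Daedalus turned to fusion.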

In 1986, interest in this idea was revived by the report of the US National Commission on Space. It recommended focused research into human spaceflight, efficient and sustainable fuel options, astrometry and interstellar research toward developing a “pioneering” American mission in the early 21st century. Particularly, it suggested developing a “long-life, high-velocity spacecraft to be sent out of the Solar System on a trajectory to the nearest star”. These ideas were advanced by a team of engineering students and NASA scientists, working with the US Naval Academy, which published its own 74-page report in 1987 titled ‘Project Longshot: An Unmanned Probe to Alpha Centauri’. Like Daedalus, Longshot aspired to use inertial confinement fusion – using energetic beams of particles to compress pellets of helium-3 and deuterium to fusion conditions – but with four key differences.

First, instead of obtaining helium-3 from the moon or the clouds of Jupiter, the team suggested using particle accelerators. Second, the nuclear reactions would be executed at small scales within a “pulsed fusion microexplosion drive”. A magnetic casing surrounding the drive chamber would then channel out a stream of charged particles produced in the reaction to create thrust. Third, while Daedalus would fly by the Alpha Centauri system’s stars (and release smaller probes), Longshot was designed to be able to get into orbit around the star Alpha Centauri B. Fourth, and most important, the fusion reactor was not to be used to launch Longshot from the ground nor away from Earth but only to propel it through space over 100 years. This was because of the risk of the fusion reactor going out of control in or close to Earth’s atmosphere. The team recommended sending each of the spacecraft’s modules to a space station (the ISS would be built a decade later), then assembling and launching it from there.

Illustrations showing how the inertial fusion reactions inside Longshot’s reactors would power the rocket. Credit: stanford.edu

The report was published only three years after the sci-fi writer Robert L. Forward described his idea of the Starwisp, a satellite fit with a sail that would be pushed on by beams of microwave radiation shot from Earth – much like Starshot. However, the Longshot report devotes three pages to discussing why a “laser-pumped light sail” might not be a good idea. The authors write: “The single impulse required to reach the designated system in 100 years was determined to be 13,500 km/sec. The size of a laser with continuous output, to accelerate the payload to 13,500 km/sec in a year, is 3.75 Terra Watts.” The payload mass was considered to be 30,000 kg. The specific impulse (Isp, evaluated in the table below) is measured in seconds: it is the thrust an engine produces per unit weight of propellant consumed per second – equivalently, how long a given weight of propellant can sustain an equal weight of thrust.
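The seconds of specific impulse map directly onto an effective exhaust velocity, which is what actually enters the rocket equation. A small sketch of that conversion, with illustrative Isp values I’ve picked (roughly a chemical upper stage, an ion thruster, and the fusion-scale figure an interstellar probe would demand):

```python
G0 = 9.80665  # standard gravity, m/s^2

def isp_to_exhaust_velocity(isp_seconds):
    """Effective exhaust velocity (m/s) from specific impulse (s): v_e = Isp * g0."""
    return isp_seconds * G0

# Illustrative values only: ~450 s chemical, ~3,000 s ion, ~1,000,000 s fusion-scale
for isp in (450, 3000, 1_000_000):
    v_e = isp_to_exhaust_velocity(isp)
    print(f"Isp {isp:>9,} s -> v_e = {v_e / 1000:,.1f} km/s")
```

The higher the Isp, the faster the exhaust and the less propellant needed for a given delta-v – hence the Longshot team’s table trading off propulsion options by this single figure of merit.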

Table considering trade-offs between various propulsion options and their feasibilities. Credit: stanford.edu

The Starshot team works around this problem by shrinking the spacecraft (StarChip) to a little bigger than a postage stamp – allowing a 100-gigawatt laser to propel it to very large velocities in a few dozen minutes. (A back-of-the-envelope calculation discounting the effects of the atmosphere shows a 50-GW laser working at 100% efficiency will suffice to push a 10-gram StarChip + sail to 60,000 km/s in 30 minutes.) Another boon of the small size is that the power required to operate the onboard instruments is very low, whereas Longshot required a 300-kW fission reactor to power its payload. So the real innovation on this front is not in propulsion or even the lasers but in the miniaturisation of electronics.
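That back-of-the-envelope figure can be reproduced from photon momentum alone. For a perfectly reflective sail, each joule of reflected light delivers 2/c of momentum, so the minimum beam power follows directly (this sketch assumes vacuum, 100% reflectivity and no relativistic corrections):

```python
C = 299_792_458  # speed of light, m/s

def beam_power_watts(mass_kg, target_speed_ms, accel_time_s):
    """Minimum laser power to push a perfectly reflective sail:
    reflected light of energy E transfers momentum 2E/c,
    so P = (m * v / t) * c / 2."""
    momentum = mass_kg * target_speed_ms  # total momentum needed, kg*m/s
    force = momentum / accel_time_s       # average thrust required, N
    return force * C / 2

# 10-gram StarChip + sail to 60,000 km/s in 30 minutes
p = beam_power_watts(0.010, 60_000e3, 30 * 60)
print(f"{p / 1e9:.0f} GW")  # -> 50 GW
```

Momentum transfer, not energy transfer, is the binding constraint here, which is why the answer lands at tens of gigawatts even for a 10-gram craft.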

One of the scientists involved in the StarChip team is Zachary R. Manchester, who launched a project called KickSat in 2011 that was selected for the NASA CubeSat Launch Initiative in 2015. A press statement accompanying the selection reads: “the Sprite [a single “ChipSat”] is a tiny spacecraft that includes power, sensor and communication systems on a printed circuit board measuring 3.5 by 3.5 centimetres with a thickness of a few millimetres and a mass of a few grams”. Each StarChip is likely to turn out to be something similar.

A batch of KickSat Sprites lying on a table. Credit: @zacinaction on Twitter
A preview of KickSat-2 showing its component circuits. Credit: @spacecraftlab on Twitter

However, the miniaturisation of electronics doesn’t solve the other problems the Longshot team anticipated, and which Milner’s team has chosen to work around rather than solve. The biggest among them is deceleration. Even Longshot’s 100-year transit from a space station in Earth orbit to a star 4.37 lightyears away consists of accelerating for about 71 years followed by 29 years of deceleration. In contrast, the StarChip fleet won’t decelerate at all but pretty much sneeze past Alpha Centauri, making rapid, well-timed measurements.

Another problem is whether the StarChip sails will be able to withstand being hit by a 100-GW laser. Recent sail-based experiments like IKAROS and LightSail-1 have demonstrated the feasibility of light sails, at least when it comes to being propelled by photons streaming out from the Sun, as well as their engineering limitations. Borrowing lessons from these missions, the Starshot collaboration has proposed that a suitable metamaterial be developed that is extremely reflective and absorbs as few of the laser’s photons as possible. According to a calculation on the project’s website, absorbing even one ten-thousandth of the laser’s energy would quickly overheat the sail, so the absorption has to be brought down to about a billionth. In fact, as is often overlooked, having endless possibilities also means having endless engineering challenges, and there are enough for Breakthrough Starshot to warrant the $100 million from Milner.
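The thermal arithmetic behind that requirement is stark. Assuming the 100-GW beam, a sketch of the power a gram-scale sail would have to dissipate at the two absorption fractions mentioned:

```python
BEAM_POWER_W = 100e9  # assumed 100-GW laser

# Absorption fractions from the discussion above
for fraction, label in [(1e-4, "one ten-thousandth"), (1e-9, "a billionth")]:
    absorbed = BEAM_POWER_W * fraction
    print(f"absorbing {label}: {absorbed:,.0f} W onto a gram-scale sail")
```

Even one ten-thousandth of the beam is 10 megawatts landing on a few grams of material; only at a billionth does the load fall to a survivable ~100 W.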

What makes the project truly exciting is its implicit synecdoche – none of its challenges is a real deal-breaker, even as surmounting all of them together would give birth to a wild new idea in interstellar research. Unlike Orion, Daedalus or Longshot, Starshot steers clear of the controversies and technical limitations attendant to nuclear power, and is largely divorced from political considerations apropos research funding. Most importantly, Starshot isn’t proposing a bigger-therefore-better idea; rather, it breaks from the past to better leverage our advanced abilities to manipulate materials, and shows a way out of Tsiolkovsky’s tyranny (even with a nuclear engine, Daedalus’s conceivers suggested the rocket carry 50,000 tonnes of fuel – and theirs was a more serious design effort than what went into Longshot). As with all human enterprises, Starshot is worth celebrating if only for the drastic leap in efficiency it represents.

The Wire
April 13, 2016

The INO story

A longer story about the India-based Neutrino Observatory that I’d been wanting to do since 2012 was finally published today (to be clear, I hit the ‘Publish’ button today) on The Wire. Apart from myself, four people worked on it: two amazing reporters, one crazy copy-editor and one illustrator. I don’t mean to diminish the illustrator’s role, especially in setting the piece’s mood so well, but the reporters and the copy-editor did a stupendous job of getting the story from 0 to 1. After all, all I’d had was an idea.

The INO’s is a great story but stands unfortunately to become a depressing parable at the moment – the biggest bug yet in a spider’s web spun of bureaucracy and misinformation. As told on The Wire, the INO is India’s most badass science experiment yet, but its inherent sophistication has become both its strength and its weakness: a strength for being able to yield cutting-edge science, a weakness for being the ideal target of stubborn activism, unreason and, consequently and understandably, fatigue on the part of the physicists.

From here on out, it doesn’t look like the INO will get built by 2020, and it doesn’t look like it will be the same thing it started out as when it does get built. Am I disappointed by that? Of course – and bad question. Am I rooting for the experiment? I’m not sure – and a much better question. In the last few years, as the project’s plans gained momentum, some unreasonable activists were able to cash in on the Department of Atomic Energy’s generally cold-blooded way of dealing with disagreement (the DAE is funding the INO). At the same time, the INO collaboration wasn’t as diligent as it ought to have been with the environmental impact assessment report (getting it compiled by a non-accredited agency). Finally, the DAE itself just stood back and watched as the scientists and activists battled it out.

Who lost? Take a guess. I hope the next Big Science experiment fares better (I’m probably not referring to LIGO because it has a far stronger global/American impetus while the INO is completely indigenously motivated).

Podile, plagiarism, politics

On 17 January, Vemula hung himself, saying in his suicide note, “my birth is my fatal accident.” His death has rocked academia, with unabated protests on the Hyderabad campus and elsewhere. Even before the incident, Tandon and others openly referred to Appa Rao Podile, the university’s vice chancellor, as the famed institution’s first political appointee. Appa Rao has since left the university on indefinite leave.

This is from a February 3, 2016, blog post on Scientific American by Apoorva Mandavilli. I quote it to answer the question different people have asked me throughout the day: “Why did you not publish your piece on three of Appa Rao’s papers containing plagiarised content earlier or later?” (The link to my piece, which appeared on The Wire, is here.) Appa Rao, as Mandavilli writes, is the university’s first VC to be appointed via the political route. In fact, according to The Times of India, he once campaigned for the Telugu Desam Party.

My piece was put together over three or four days – from the time I found out about the issues in Appa Rao’s papers to when I had all the information necessary to put my article together. Finally, though its publication date was postponed by a day thanks to the release of the Panama Papers, nothing else was taken into account apart from checking if The Wire had done due diligence before hitting the ‘Publish’ button. Having said all this, I ask: if Appa Rao is the first politically appointed VC at the University of Hyderabad, how can anything he does not be examined through a political lens?