Replies to the government’s concerns with our criticism of the DNA Profiling Bill

In response to the piece ‘Modi Wants the DNA Profiling Bill Passed Right Away. Here’s Why It Shouldn’t Be‘, published July 24, 2015, Dr. J. Gowrishankar, Director of the Centre for DNA Fingerprinting and Diagnostics, wrote a spirited response describing the benign intentions behind the Bill and why there is a real need for it in India, where the criminal justice system is known to be tardy.

I agree with large sections of his response, but am disappointed that it doesn’t address any specific points of failure – especially the lacklustre privacy and accountability safeguards. This is also why I don’t ask for the Bill to be shredded but that it be referred to a Parliamentary Standing Committee (at least) before it can be tabled. The following is an unnumbered ‘listicle’ of my replies to Gowrishankar’s response.

That a part of the Indian Bill’s strength lies in having borrowed parts of laws from other countries, where DNA profiling has been around for more than a decade

The text of India’s Human DNA Profiling Bill may in large part be based on that from the USA, UK, Canada, etc., but many of the problems that the Bill could exacerbate are unique to India – such as the many privacy and accountability concerns highlighted in my article. Those parts of the Bill can’t be compared to what’s happening in the West. In fact, the USA, UK and Canada also have legislation in place that explicitly specifies how the DNA profiles can be collected, the best practices for storing and indexing them, as well as who can access them, in what circumstances and how. The DNA Identification Act 1994 (USA) specifies that all federally supported DNA labs comply with operational standards for collection, storage and analysis set by the FBI. The Criminal Justice and Public Order Act 1994 does the same in the UK. The DNA Identification Act 1998 (Canada) also does the same and further requires a periodic review of itself every five years.

That DNA profiling has a steadfast record in being able to solve disputes and that my skepticism of it is misplaced

Yes, DNA-profiling has a fabulous track record in settling disputes. However, the drafting committee, as well as anyone interested in the Bill’s tabling, would do well to learn from the mistakes of those who have been systematically applying DNA-profiling to the resolution of civil and criminal disputes in modern times. I am skeptical of the technique – as I’m skeptical of all techniques – so I’ve asked that the Bill be cognisant of the various statistical blips and prescribe best practices to eliminate them. As I write in my article: “This isn’t to say that a reliable [match] can never be arrived at, but only that the draft Bill does not have the commensurate depth required to identify and tackle the sort of statistically motivated mistakes in DNA profiling. In fact, it also abdicates itself from specifying any best practices for the collection, storage and analysis of DNA samples…”

That only identity-neutral information derived from a person’s DNA will be stored in the database

The Bill doesn’t say this. As far as the draft document is concerned, the contents of the database are profiles – not identity-neutral profiles, just profiles. I respect your attitude to privacy but I only ask that it be reflected fully in the Bill as well.

That a database of DNA profiles will only contain the profiles of offenders, missing persons, unidentified bodies and volunteers and that its regulation will, beyond the Bill’s sanctions, require judicial oversight

Bringing criminals to justice faster is a good aspiration to have, but it must not be pursued at the expense of anybody’s privacy, and the government’s actions – in the form of the Board’s – must always be accountable. On the question of retention: it’s understandable if you want to store the profiles of those who are repeat offenders – but why indefinitely? The law in the UK stipulates that profiles can be retained for a maximum of six years. And what’s the rationale behind storing the profiles of those who have been sentenced for life or to death?

That the Board has been given discretionary powers to empower them to keep up with advances in DNA profiling, and that the Board will be staffed by, for example, the Chairperson of the NHRC

Those staffing the Board may be upstanding folk but the Bill has a responsibility to account for the worst of times as well. I don’t want to have to keep a check on who’s on the Board and who’s not – I want the Bill to provide guarantees once and for all that things won’t go wrong. Please also note that the Bill is scheduled to be introduced at a time when the country’s leadership is unwilling to accept that the right to privacy is a fundamental right, at a time when the Central government insists on interfering in the management of highly regarded public institutions. I can only read the Bill’s intentions through the lens of the government that will enact and, ultimately, be responsible for enforcing it.

That the DNA profiles’ database will contain only digital information and not the physical samples from which the data has been derived

You have already stated that setting up the Indian database will incur a one-time cost of Rs.20 crore. On the other hand, I would like you to explain who will pay for acquiring the DNA profiles at costs that could well run into thousands of crores. In fact, the Bill does not contain the word ‘cost’ in it and seems unconcerned about how its implementation will be funded.

Next, on the question of whether the DNA database will store the physical samples from which the profiles will be derived: Usha Ramanathan – a researcher and advocate who was a dissenting member of the Bill’s drafting committee – has revealed an email communication she had with Gowrishankar dated June 25, 2014, in which he states the following:

“On your question of destruction of DNA collected from the relatives, I wish to state that the CDFD has so far not destroyed any DNA sample received by it since its inception. These samples are being maintained in safe custody in the institute. Once again, it is my assessment that the policy on such destruction needs to be developed and evolved by the proposed DNA Profiling Board.”

As a result, could the costs be comparable to those of the NDNAD in the UK?

That my criticism has cherry-picked facts from the Bill

I have cherry-picked facts, but never out of context (that’s the reason the article runs into 4,000 words). I still want a Human DNA Profiling Bill to be passed and agree with you that it has benefits – but it gets to them at a great cost. That’s why I’d like to repeat my statement that the Bill be referred to a Parliamentary Standing Committee, and its niggling as well as substantial issues be resolved to everyone’s satisfaction, before it’s tabled.

The Wire
July 25, 2015

Featured image credit: stewdean/Flickr, CC BY 2.0.

Here's why the Human DNA Profiling Bill shouldn't be passed in its current form

The Human DNA Profiling Bill which the Narendra Modi government wants to pass in the current session of Parliament is one of the most intrusive enactments of its kind anywhere in the world, a measure that will render obsolete the national debate on privacy before it has even begun.

Drafted by the Department of Biotechnology (DBT) in the Ministry of Science & Technology, the Bill’s pithy title belies the ambitious, even disturbing, goals that its text envisions. To be sure, that it was drafted at the outset to expedite civil and criminal disputes where possible, to help identify the unclaimed dead, and to track down missing persons is a benign, even desirable, intention to have. Where it fails is in situating this agenda in an accountable and secure framework of rules.

Once passed, the law will set up a national DNA database, a DNA Profiling Board and a mechanism for the use of DNA profiles to resolve criminal and civil disputes with few safeguards to guard against the abuse of this information.

For example, in the Bill, a version of which The Wire was able to access, the Board gives itself wide-ranging discretionary powers about whose name gets into the database (sometimes without consent), who gets to access the DNA profiles, what the database could be used for (“population” studies), and who watches the watchers (in a word, nobody) – readying a potent cocktail of abuse.

The Bill is set to be tabled in the monsoon session of Parliament, which began on July 21. But that could be too soon given the scope and seriousness of the issues the draft raises. The proposed law’s failures broadly have four facets – reliability, costs, privacy and accountability – and the Bill, if passed in its current form, could gravely jeopardise the integrity of sensitive biological information as well as poison the criminal justice system with a false conviction of judicial infallibility. In the absence of a reason to expedite its passing, the draft Bill could instead be referred to a Parliamentary Standing Committee before it’s tabled.

DNA profiling

Credit: johnnieb/Flickr, CC BY 2.0.

After human fingerprints were pressed into the service of criminal investigations in 1892, DNA profiles became the only other biological marker discovered by scientists to be unique to each individual. Since fingerprints at a crime scene can be easily obfuscated, or not left behind at all, and it is almost impossible for a criminal to not leave behind a clue bearing his or her DNA, DNA profiling has assumed great importance in modern forensic science.

Every cell of the body contains a copy of the DNA molecule, a total of three billion base pairs of smaller molecules called nucleotides neatly arranged into structures called chromosomes. Consider this a giant word with three billion letters. Some 99.9% of those letters are identical for every individual – but that 0.1% difference amounts to three million letters that are arranged in a different configuration. Among them, there are parts that contain a short combination of letters repeated a few times. These are called short tandem repeats (STRs), and the frequency of their repetition differs from person to person – so much so that no two (known) people have the same DNA overall, unless they’re identical twins or closely related. Identifying this difference forms the basis of DNA profiling, also known as DNA fingerprinting.
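To make the mechanics concrete, here is a minimal sketch – with invented locus names and repeat counts, since neither the Bill nor the CDFD specifies these in public – of how a DNA profile reduces to a table of STR repeat counts at a handful of loci, and how two profiles are compared:

```python
# Toy model of STR-based DNA profiling. Locus names and repeat counts
# are invented for illustration; real profiles use 10-17 standard loci.

# A profile: for each locus, the pair of repeat counts (one per chromosome).
profile_suspect = {"locus_A": (12, 14), "locus_B": (9, 9), "locus_C": (15, 17)}
profile_scene   = {"locus_A": (12, 14), "locus_B": (9, 9), "locus_C": (15, 18)}

def matching_loci(p, q):
    """Count the loci at which both repeat-count pairs agree exactly."""
    return sum(1 for locus in p if sorted(p[locus]) == sorted(q[locus]))

matches = matching_loci(profile_suspect, profile_scene)
print(f"{matches} of {len(profile_suspect)} loci match")
# -> 2 of 3 loci match: a single clean mismatch is enough to exclude the
# suspect (barring mutation or laboratory error).
```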

The idea of the Bill was first mooted by the DBT in 2003, during the National Democratic Alliance government of Atal Bihari Vajpayee. In 2007, the DNA Profiling Advisory Committee, which had been put together by the DBT, developed the Human DNA Profiling Bill 2007 that has seen changes between 2007 and 2012. In January 2013, a committee of experts was formed to scrutinise the 2012 draft: J. Gowrishankar, Director, CDFD; R.K. Gupta, adviser (C&I), Planning Commission; Jacob P. Koshy, science writer, Mint; Kamal Kumar, retd. IPS, retd. DGP of Hyderabad; C. Muralikrishna Kumar, senior adviser (ICT), Planning Commission; Usha Ramanathan, researcher and advocate; T.S. Rao, adviser, DBT; N. Madhusudan Reddy, staff scientist, CDFD; Raghbir Singh, fmr. Secy., Ministry of Law; Alka Sharma, Director, DBT.

Till late 2014, the committee continued to deliberate and make changes to the draft Bill. Then, it was circulated within the Ministry of Science & Technology for comments, which were then incorporated in the draft.

By January 2015, the revised document had wound its way to the Legislative Department of the Ministry of Law & Justice. According to DBT Secretary K. VijayRaghavan, the department has now finished drafting the Bill and “processed it further for the necessary approval”.

In the same period, 2003-2015, the Central and various state governments have toyed with the idea of collecting and storing DNA profiles. Notably, the Tamil Nadu government sought to amend the Identification of Prisoners Act 1920 with the intention of setting up a database of prisoners’ profiles. In 2012, the Uttar Pradesh government made it mandatory for the DNA profiles of dead persons to be saved along with the postmortem.

Although the draft Bill banks on an amendment to the Criminal Procedure Code made in 2005 – to allow DNA evidence to be admissible in a court – its principal and most problematic feature is the central repository it envisages of DNA profiles belonging to crime suspects, criminal offenders, missing persons, unknown deceased persons, and volunteers.

Its contents and operation will be managed by a DNA Profiling Board and a Databank Manager that the Board will appoint, who altogether have too many discretionary powers that drag the credible parts of the document down. These parts include useful mechanisms such as one for post-conviction DNA testing (where a conviction can be overturned by allowing the defendant to appeal for a DNA test).

Overall, the draft Bill has four major flaws:

  1. Reliability of DNA profiling
  2. Visible and hidden costs
  3. Privacy and anonymisation
  4. Power and sunset clauses

I. Reliability of DNA profiling

Credit: kaibara/Flickr, CC BY 2.0.

What are the chances you’ll be killed in an airline accident? There is a number ascribed to this high-cost enterprise, and it is calculated using statistics because it’s hard to estimate how the failure of one of thousands of the components constituting it will or won’t precipitate the failure of the overall entity. So, the chances that you’ll be killed in an airline accident are 1 in 4.7 million. That means if 4.7 million flights are undertaken, one of them will result in a fatal accident, right? Not exactly, because the chances of an accident could be significantly increased if certain components of an aircraft fail, and engineers are not aware of all such precipitant failures.
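A quick calculation illustrates why the naive reading fails even before correlated component failures enter the picture – a sketch that assumes, purely for simplicity, that every flight fails independently with the same probability:

```python
# If each of n flights independently has probability p of a fatal accident,
# the accident count is approximately Poisson-distributed with mean n*p.
from math import exp

p = 1 / 4.7e6        # quoted odds of a fatal accident per flight
n = 4.7e6            # number of flights undertaken
lam = n * p          # expected number of accidents = 1

p_none = exp(-lam)                   # ~0.37: no accident at all
p_one = lam * exp(-lam)              # ~0.37: exactly one accident
p_two_plus = 1 - p_none - p_one      # ~0.26: two or more accidents
print(p_none, p_one, p_two_plus)
# Even under these idealised assumptions, 'one accident per 4.7 million
# flights' is only a long-run average - and the independence assumption
# itself breaks down when component failures are correlated.
```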

Analysing the DNA of an individual to look for clues about her/his identity is subject to similar stochastic caveats. This is because, despite the many unique properties of the DNA molecules in our bodies, our ability to preclude errors in indexing them isn’t perfect. The implication is that DNA profiling throws up fewer errors when validating or invalidating less systematic proof, but there are errors nonetheless that a law – and definitely a court interpreting that law – must be aware of.

Moreover, the proofs are also dependent on how rarely or often the STRs have been observed in the past. Estimates of their rarity are based on studying some preset locations on the DNA: the CODIS database of DNA profiles in the US looks at 13 locations, the NDNAD in the UK looks at 10, whereas Interpol analyses look at 12. The CDFD (Centre for DNA Fingerprinting and Diagnostics) – the nodal agency for DNA analysis in the country – plans to look at 17, according to Dr. J. Gowrishankar, its director. These locations were determined to be important in the early days of DNA forensics, and according to lawyers in the US and UK are overdue for a reexamination.
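In its simplest textbook form, the rarity estimate multiplies the observed genotype’s frequency at each location – the so-called product rule. Here is a hedged sketch with made-up frequencies; real ones must come from a reference database such as the volunteers’ index:

```python
# Textbook 'product rule' for a random-match probability (RMP):
# multiply the population frequency of the observed genotype at each locus.
# The frequencies below are invented for illustration.

genotype_freqs = [0.08, 0.11, 0.05, 0.09, 0.07, 0.12, 0.06, 0.10, 0.08, 0.05]

rmp = 1.0
for f in genotype_freqs:
    rmp *= f

print(f"RMP across {len(genotype_freqs)} loci: {rmp:.1e}")  # ~8.0e-12
# Adding loci (10 for the NDNAD, 13 for CODIS, 17 proposed by the CDFD)
# shrinks the RMP, but the answer is only as good as the reference
# database the frequencies are drawn from - which is the point at issue.
```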

The Human DNA Profiling Bill, on the other hand, is dismissive of this aspect of the technique it is centred on, with its January 2015 draft saying in its introduction that DNA profiling can distinguish between any two people “without a doubt”. The words give the impression that the experts involved in drafting it have no reason to believe that DNA profiles could ever be fallacious. In fact, conspicuously missing from the document are the statistical procedures (performed on DNA information) that will be admissible as evidence in a court of law.

Speaking to The Wire, Gowrishankar clarified that the three words “without a doubt” had been removed from a later iteration of the draft Bill – the version that would be tabled in Parliament. However, he also added that he would be able to defend the infallibility of the technique.

In 2009, New Scientist reported the case of Charles Richard Smith. Smith was convicted of a sexual assault on Mary Jackson (not her real name) in Sacramento, California, which took place in January 2006. Jackson was sitting in a parking lot when a stranger jumped into her truck and made her drive to a remote location before forcing her to perform oral sex on him. When police arrested Smith and took a swab of cells from his penis, they found a second person’s DNA mixed with his own.

Mark Henderson’s 2012 book The Geek Manifesto: Why Science Matters elaborates on what happened during Smith’s trial (p. 158):

… a forensic scientist testified that the chances that the sample did not come from Jackson were just 1 in 95,000. Smith was convicted and jailed for 25 years. Genetic evidence, however, can be analysed in multiple ways. The analyst who provided the 1 in 95,000 number was convinced that he saw reliable ‘peaks’, indicating matches, at most of the 13 places in the genome where American forensic scientists compare DNA. His supervisor, whose evidence was also presented, thought fewer of these matches were reliable, and so put the probability that the DNA wasn’t Jackson’s at 1 in 47. A subsequent review of the case used a different technique, based on a computer algorithm, to compare the likelihood of the different interpretations of the evidence advanced by the prosecution and the defence. This suggested that this pattern of evidence was only twice as likely if the DNA was Jackson’s than if it belonged to someone else.

This isn’t to say that a reliable estimate can never be arrived at, but only that the draft Bill does not have the commensurate depth required to identify and tackle the sort of statistically motivated mistakes in DNA profiling. In fact, it also abdicates itself from specifying any best practices for the collection, storage and analysis of DNA samples – while in countries like the UK and USA, a more mature approach to DNA profiling has been instituted through laws like the DNA Identification Act 1994 (USA), the Criminal Justice and Public Order Act 1994 (UK) and the DNA Identification Act 1998 (Canada).

According to Gowrishankar, “The Bill has been drafted keeping the future in mind, so we have not included the different ways in which the information can be analysed. We want to keep our options open.” He added that it was up to the defence attorneys to refute findings.

The upper hand that DNA profiling claims in being able to identify a person is bifurcated: it simultaneously relies on being similar to one set of data and being dissimilar to another. And how much a profile is closer to one and farther from the other can be interpreted in many ways – all of them reliant on a control group, a reference point based on which the analyst can say how much similarity and dissimilarity a profile exhibits. This control group is defined by a sub-database that contains the DNA profiles of volunteers. Gowrishankar said that the significance of each match (or mismatch) will be determined relative to how unique the ‘letters’ in the profiles are. As a result, the size of the volunteers’ database plays a critical role in determining the outcome of cases.

In 2007, the noted legal experts Michael Saks and Jonathan Koehler presented a problem called the individualisation fallacy that arises when examiners confuse infrequency with uniqueness – a flaw that can be eliminated (to a certain extent) only by enlarging the control, i.e. volunteers’, database. For example, if an anomalous pattern in the DNA of a person has a one-in-a-quintillion chance of occurring (based on its frequency of occurrence among the volunteers), the examiner will assert that, given the population of all the people on Earth, only that person’s DNA has that pattern (absolute uniqueness). However, the examiner assumes wrongly that he/she is aware of all the sources of that anomaly in human genetics (relative uniqueness). A similar mix-up between the two kinds of uniqueness results in the prosecutor’s fallacy exemplified in the infamous Sally Clark case of 1999.
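The gap between infrequency and uniqueness is easy to put in numbers – a sketch with purely illustrative figures:

```python
# Infrequency is not uniqueness: even a tiny random-match probability (RMP)
# implies a calculable number of expected matches in a large population.

population = 7.3e9  # approximate world population in 2015

for rmp in (1e-18, 1e-9, 1e-6):
    expected_matches = population * rmp
    print(f"RMP {rmp:.0e}: ~{expected_matches:.2g} other expected matches")

# RMP 1e-18 -> ~7.3e-09 expected matches (effectively unique), but
# RMP 1e-06 -> ~7,300 expected matches. And even the first figure assumes
# the examiner knows every genetic source of the anomaly - the unwarranted
# assumption at the heart of the individualisation fallacy.
```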

Another issue that worsens the reliability of results is that the draft bill doesn’t explicitly require regular checks for sample contamination, even though it goes to some length to describe what will happen to those who are found damaging samples in any way. How credible those sanctions are is a different matter. In at least one high-profile human rights case, the murder of five Kashmiri civilians at Pathribal in 2000, DNA samples were tampered with in an attempt to absolve the security forces of the charge of murder. The police officer who orchestrated the tampering was never punished.

II. Visible and hidden costs

Credit: Wikimedia Commons

The CDFD charges Rs.5,000 for each blood sample or person and Rs.10,000 for each “forensic exhibit” – such as an item of clothing from a crime scene – plus an additional 12.36% service charge levied by the Government of India. Though the draft Bill proposes including the profiles of only those under the scanner of the criminal justice system, data from the National Crime Records Bureau shows that over 32.7 lakh people were arrested in 2012 alone on criminal charges (proven and unproven). And while Gowrishankar said the official estimates were Rs.5 crore a year for keeping the database updated, acquiring the DNA profiles alone would cost more than Rs.1,800 crore.
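The Rs.1,800 crore figure follows directly from the CDFD’s rate card – a rough, assumption-laden calculation that treats each arrestee as a single blood sample:

```python
# Back-of-the-envelope cost of profiling everyone arrested in 2012,
# at CDFD rates. Assumes one Rs.5,000 blood sample per person.

arrestees = 32.7e5          # 32.7 lakh people arrested in 2012 (NCRB)
rate = 5000                 # Rs per blood sample (CDFD)
service_charge = 0.1236     # 12.36% Government of India service charge

total = arrestees * rate * (1 + service_charge)
print(f"Rs {total / 1e7:,.0f} crore")  # 1 crore = 1e7 rupees -> ~Rs 1,837 crore
```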

The number of 32.7 lakh (even if only for reference) is too bloated for the database’s purposes because it also includes persons accused of minor crimes. Even if the database has to be as big as possible to minimise the effects of the individualisation fallacy, its size becomes meaningless after a point, as the British government discovered in 2008. In that year, the number of profiles on the NDNAD jumped from 1.9 million to 4.1 million but the number of cases solved by the use of DNA profiles fell by 2,632 to 17,614. This was because the additional 2.2 million profiles were almost entirely of people who hadn’t been charged with any offences, making them irrelevant when it came to comparisons with profiles picked up from crime scenes. Similarly, the draft Bill would do well to include only the profiles of those charged with serious criminal offences – comparisons would be more efficient and costs would be lower.

Next, according to GeneWatch UK: “In 2010, putting someone’s DNA profile on the database in England and Wales was estimated to cost £30 to £40 and storing one person’s DNA sample was estimated to cost £1 a year.” The CDFD’s analysis rates are comparable to these numbers – so it must be noted that the capital cost of setting up the database in the UK was £300 million (approx. Rs.3,000 crore). Third, there is the operational cost – to maintain the communication and security infrastructure, and ensure it is compatible with indices like the CODIS. In fact, in September 2014, the FBI and the CDFD signed an agreement to install an instance of CODIS in CDFD’s Hyderabad office and train the personnel there. However, Gowrishankar said all of this would warrant only Rs.20 crore.

None of these expenses are mentioned in the draft Bill.

III. Privacy and anonymisation

Credit: home_of_chaos/Flickr, CC BY 2.0.

A person’s DNA profile contains information similar to a password’s – only more visceral. In the mammoth spatial configuration of the DNA’s atoms are encoded many of our characteristics and personal tendencies – including colour, race, behavioural features and susceptibility to some diseases. However, the few of the three million variable positions that the CODIS, the NDNAD or the CDFD will be looking at are considered “neutral” – they don’t codify any of our features that might give our identities away, so it’s safe to store them without being anxious about what the government is finding out about us. That’s what Gowrishankar says, too: that only information from the 17 positions the CDFD will consider will be stored in the database.

However, this information is missing in the draft Bill, giving the impression that non-neutral information from people’s DNA profiles will be stored as well – and sans any safeguards beyond the Bill itself, of the kind the USA has in the Genetic Information Nondiscrimination Act 2008. Gowrishankar said that the Bill omitted this detail because some advancement in the future could require analysing more than 17 neutral positions, or fewer, or others altogether, and that if the Bill had been specific to that extent, it would have to be modified over and over again to keep up with the times. Be that as it may, the draft Bill in its current form neither prevents the database from holding distinctly personal information nor acknowledges that possibility.

In that context, the information should be accorded the same rights that information on the Internet, or anywhere else, is accorded – if not more. First, a person should be able to appeal the inclusion of her DNA profile in the database – although Gowrishankar insisted no profile could mistakenly enter the database as it would require either a court order or an expression of consent to get there. Second, the person should be able to access her/his own DNA profile whenever the need arises through appropriate legal channels – which he said wouldn’t be possible at all. Third, the person whose profile is under scrutiny should be able to know how the information contained is being used and why, and to ascertain its deletion when due. These three rights are missing in the draft bill.

Moreover, in a separate note, the committee says,

The Expert Committee also discussed and emphasised that the Privacy Bill is being piloted separately by the Government. That Bill will override all the other provisions on privacy issues in the DNA Bill.

But even as the draft DNA-profiling bill seeks to deflect the responsibility of securing privacy to the Privacy Bill, a Report of the Group of Experts on Privacy, chaired by Justice A.P. Shah (former Chief Justice of the Delhi High Court), explicitly set out the missing privacy and security provisions in October 2012, and a majority of them remain unresolved or unaddressed. By neglecting them, the CDFD and the DNA Profiling Board run the risk of turning themselves opaque and, for all practical purposes, unaccountable. For example, the draft Bill does not:

  1. Provide a notice that DNA samples were collected from such-and-such areas of the body
  2. Inform anybody – particularly the individual – if and when her/his DNA is contaminated, misplaced or stolen
  3. Inform a person if a case involving her/his DNA is pending, ongoing or closed
  4. Inform the people when there are changes in how their DNA is going to be accessed, or if the way their DNA is being stored or used is changed
  5. Distinguish between when DNA can be collected with consent and when it can’t
  6. Say how volunteers can contribute their DNA to the database even though the draft Bill has a provision for voluntary submissions
  7. Provide any explicit guarantee that the collected DNA won’t be used for anything other than circumstances specified in the Bill
  8. Specify when doctors or the police can or can’t access DNA profiles

Without these protections, the DNA profiles could be collected for one purpose but end up being used for something else. Consider #7 – the draft Bill doesn’t aspire to be self-contained and leaves itself open to expanding in the future. At one point (Sec. 31(4)), it spells out the various indices according to which profiles in the database will be stored:

Every DNA Data Bank shall maintain following indices for various categories of data, namely:

(a) a crime scene index;
(b) a suspects’ index;
(c) an offenders’ index;
(d) a missing persons’ index;
(e) unknown deceased persons’ index;
(f) a volunteers’ index; and
(g) such other DNA indices as may be specified by Regulations.

Why bother to specify any of the indices at all if the committee has (g)? And without specifying what regulations those could be and who, apart from the DNA Profiling Board, has the authority to spell them out, the draft Bill signals it could just about bring anyone’s DNA profiles into the database.

Additionally, who will watch the watchmen? The DNA Profiling Board is tasked – rather, tasks itself – with determining which DNA profiles enter the database, who gets to access them, and how the database will be organised and maintained, in effect establishing a low-quality check on itself. Although Gowrishankar clarified that there would be a Parliamentary check on the Board’s activities and that Parliament would be the ultimate arbiter for all “major” issues arising due to the Bill, there is still a lack of supervision – and potential for abuse – in the day-to-day dispensation of duties. If the Human DNA Profiling bill has to be effective and honest, it must account for the privacy shortcomings described by the Group of Experts.

Another concern is anonymisation – ensuring that information contained in DNA profiles can’t be used to retrace the individuals from whom they were acquired. There is no description of a form or application of any kind that the draft Bill expects to be submitted along with the materials containing human DNA. If the Bill expects to use the form currently being used by the CDFD, there is an anomaly: the CDFD form asks the applicant to mention her caste. Even if the draft Bill doesn’t explicitly mention that the database will have a ‘caste’ column, being able to associate an application form with a sample – and therefore ‘its caste’ – is plausible, especially in the volunteers’ database.

More troublingly, Section 31(6)(a) states that a DNA profile in the database will bear the identity of its source if its source is an offender, and that (b) all other DNA profiles will be relatable with the case reference number. The problem is that the case reference is not anonymised with respect to the people involved in the case.

IV. Power and sunset clauses

Credit: manoftaste-de/Flickr, CC BY 2.0.

The DNA Profiling Board overseeing the implementation of the bill (when enacted) has given itself, and the bill, some conflicting rules and powers that together result in ambiguity about the scope of the bill and its accountability. Some examples:

Conflicts of interest – Section 12(k) states that the board is responsible for “making recommendations for maximising the use of DNA techniques and technologies in administration of justice”. Then, throughout the bill, the board’s powers are also detailed as extending to specifying the rules for how DNA information is collected and secured. Put them together and the board’s essentially saying, “We’ll try to use DNA evidence for as many things as possible, we’ll decide how the information is collected for those purposes, and we’ll decide how we’ll use it.”

Ex post facto implication – Section 13 states that any laboratory that wishes to undertake human DNA-profiling must get prior consent from the board. Then, Section 14(2) allows any DNA laboratory that’s in existence at the time the bill is enacted to perform human DNA profiling without prior approval from the board.

Use of profiles – Section 39(g) states that “Information relating to DNA profiles, DNA samples and records relating thereto shall be made available” to a slew of judicial and executive agencies as well as “for any other purposes, as may be prescribed”. However, those prescriptions have not been detailed in the Bill, and appear to be at the discretion of the DNA Profiling Board. In fact, Section 39(e) states that the profiles, and “samples and records relating thereto”, may be used for creating a “population statistics” database. This is to facilitate population-wide studies of genetic characteristics, and in the absence of perfect anonymisation, could potentially become associated with caste data.

Moreover, Section 35(2), which deals with the communication of DNA profiles to foreign states and institutions, doesn’t limit it to offenders and convicts but, by not discussing it in detail, allows for any profile in the database to be shared. Put this together with an individual’s inability to appeal the inclusion of her/his profile, and anyone’s profile – as long as it has wound its way into the database – can be shared with foreign entities. There are also no restrictions on whether the foreign agencies can index the profile in another database.

Legal recourse after three months – Someone who’s been wronged by any of the provisions of the bill can approach a court only if he/she approaches the board first and gives it three months to act on a complaint. In those three months or before that, Section 57(1) of the bill prevents anyone from approaching the courts except the central government or a member of the board itself.

Finally, there’s the absence of a sunset clause – specifying when the Bill’s provisions will expire and whether there is a period after which a DNA profile will be removed from the database. For the latter, the draft Bill specifies that if a person has been acquitted in a case or if the case is set aside, the corresponding profile will be deleted, but nothing is said about the profiles of missing persons who have been identified, volunteers who have died, and other profiles that are likely to be collected at crime scenes. Moreover, no rationale is presented for retaining the profiles of those who are convicted of offences like rape or murder, who end up spending long years or a lifetime in prison. While Gowrishankar asserted that only the DNA profiles of the unidentified dead would be held forever, the draft Bill does not explicitly exclude the rest.

Given the scale of issues with the draft Bill, and its potentially disastrous sidelining of privacy concerns, its scheduled introduction in the monsoon session of the Lok Sabha seems hurried – despite having first been mooted more than a decade ago. Some of the issues may have escaped the drafting committee’s concerns by way of not having received appropriate feedback – such as the issue of hidden costs – but the committee must explain why there is a lack of access to data of the people by the people, why there are no sound anonymisation protocols, and why there are insufficient self-regulation and protection measures.

Download an annotated copy of the Human DNA Profiling Bill draft here (PDF).

The Wire
July 24, 2015

The net can’t be neutral if regulators are biased against voice-over Internet

The Department of Telecommunications’ new report (PDF) on net neutrality is a deceptive piece of work. Drawn up by a committee set up in January 2015, it was perceived as a reaction to the Telecom Regulatory Authority of India’s consultation paper in May on implementing the principles of net neutrality in Indian telecom regulation. While the TRAI document triggered a controversy by appearing ambiguous about its intentions, the DoT report presents half-measures with the aim of creating a level playing field for telecom service providers (TSPs).

What are the more contentious issues in the DoT report?

On the question of regulating domestic VoIP calls made through over-the-top (OTT) services like WhatsApp, Viber and Skype, the report appears confused. The committee writes that while app-to-app calls made internationally should be unregulated, local calls ought to be regulated by the issuance and revocation of licenses. This has prompted the Indian telecom industry to ask why the DoT wants to spare international VoIP calls. When you call your sibling in Princeton from an Airtel number in Chennai, the TSP that picks up and relays the call in the New Jersey area takes the bulk of the fee you pay to make the call – not Airtel. Likely because of this arrangement, the DoT is not concerned about regulating international VoIP calls. Of course, acknowledging the need for regulatory balance on the one hand but ignoring it on the other lays the DoT open to the charge of double standards.

However, given that it wants to regulate domestic VoIP calls in the interest of creating what it thinks is a level playing field within the country, why not protect a level playing field among TSPs abroad as well? Moreover, WhatsApp, Skype and Viber are all foreign companies and data sent via their apps could be routed through foreign servers – further blurring the distinction between domestic and international calls.

Another issue is centred on zero-rating, the practice of ISPs routing some traffic through the network at subsidised rates based on its sources. The DoT committee writes, “if government wants to give services free on the internet (like zero rating), it is considered as positive discrimination and not seen as violation of Net Neutrality. Therefore, it should be permitted in public interest. Government can provide zero rated channels to citizens for essential services (public interest zero rating), based on clear public policy and principles and on non-commercial terms.”

Because zero-rated traffic is qualified based on agreements entered into with the TSPs/ISPs, small players who can’t afford it perceive it as ‘negative discrimination’. However, the DoT report retorts that such agreements will be policed by competition laws and that smaller players’ ability to participate in the ecosystem will be protected by those laws. In fact, the report also encourages passing on the burden of accomplishing zero-rating’s goals, such as increasing access to the Internet among the masses, to solutions like free Wi-Fi and vouchers paid for by the government.

Will regulating VoIP be a direct violation of net neutrality?

There is a difference between making licenses the minimum requirement to operate a business and using licenses to regulate a business. The big bullet was going to be whether OTT and OTT-VoIP services would be banned if they didn’t come under a licensing regime. This is no longer going to be the case for OTT-application services (like Facebook), but for OTT-communications services, the DoT committee is resurrecting an argument that has been around for years.

For example, in November 2014, it was at TRAI’s behest that Skype suspended its app-to-phone calling service. One of the demands then – when the debate on net neutrality hadn’t yet kicked off in India – was not to suspend OTTs but to subject them to the same regulatory bindings that applied to TSPs. Vodafone Essar’s T.V. Ramachandran had told NDTV, “We can do a lot more if a level playing field is given to us”.

And within the limitations of net neutrality, one possible way out of the Gordian knot now is to extend restrictions on features like call-switching and do-not-disturb to OTTs as well. This could even out the regulatory imbalance as well as encourage innovation in the sector, but inevitably also impose some costs on WhatsApp that could be transferred to subscribers.

Such a regulatory environment would be similar to the one in the US, where the Federal Communications Commission doesn’t regulate VoIP calls but requires compliance with a set of law enforcement rules, contributions to a fund that pays for “communication services in high-cost areas”, maintenance of call-records, providing local number portability, and providing special services for people with speech or hearing disabilities. Some other countries – like the UK and Italy – don’t regulate VoIP either but allow differential pricing. In fact, should TRAI and DoT agree on using licenses to regulate OTTs’ domestic VoIP services, India will become the first country to do so.

Why is regulation necessary?

Whether any of these should be regulated at all comes down to a deeper conflict: the advent of VoIP is a consequence of technology’s natural disruptive capacity that’s so prized by entrepreneurs, while on the other hand it is perceived as an unfair form of arbitrage by those already invested in the sector. For example, a VoIP call costs 12.5 times less than a call made via a TSP on average – while TSPs like Airtel, Vodafone and others have already sunk Rs.7.5 lakh crore into developing infrastructure that furthers their business in the country.

An intervening regulator has to decide whom to side with, the consumer or the investor, and it is answerable to both. (Is this what prompted the DoT’s recommendation to regulate domestic VoIP calls but not text-messages?)

The answer isn’t straightforward from anybody’s perspective. Because of their low costs, TRAI could let VoIP calls remain unregulated and relax the tariff scheme for TSPs/ISPs to compete with the OTTs. On the other hand, TSPs/ISPs incur significant infrastructural costs when expanding into new areas and have to make that up. And finally, while TRAI can regulate local players but not OTTs like Facebook and WhatsApp that are based abroad, the DoT suggests it can regulate VoIP calls through a licensing regime – which can be long-winded and arbitrary.

(With inputs from Anuj Srivas.)

The Wire
July 21, 2015

Money for science

Spending money on science has been tied to evaluating the value of spin-offs, assessing the link between technological advancement and GDP, and dissecting the metrics of productivity, but the debate won’t ever settle, no matter how convincingly it is resolved each time.

For a piece titled The Telescope of the 2030s, Dennis Overbye writes in The New York Times,

I used to think $10 billion was a lot of money before TARP, the Troubled Asset Relief Program, the $700 billion bailout that saved the banks in 2008 and apparently has brought happy days back to Wall Street. Compared with this, the science budget is chump change, lunch money at a place like Goldman Sachs. But if you think this is not a bargain, you need look only as far as your pocket. Companies like Google and Apple have leveraged modest investments in computer science in the 1960s into trillions of dollars of economic activity. Not even Arthur C. Clarke, the vaunted author and space-age prophet, saw that coming.

Which is to say that all that NASA money — whether for planetary probes or space station trips — is spent on Earth, on things that we like to say we want more of: high technology, education, a more skilled work force, jobs, pride in American and human innovation, not to mention greater cosmic awareness, a dose of perspective on our situation here among the stars.

And this is a letter from Todd Huffman, a particle physicist at Oxford, to The Guardian:

Simon Jenkins parrots a cry that I have heard a few times during my career as a research scientist in high-energy physics (Pluto trumps prisons when we spend public money, 17 July). He is unimaginatively concerned that the £34m a year spent by the UK at Cern (and a similar amount per year would have been spent on the New Horizons probe to Pluto) is not actually money well spent.

Yet I read his article online using the world wide web, which was developed initially by and for particle physicists. I did this using devices with integrated circuits partly perfected for the aerospace industry. The web caused the longest non-wartime economic boom in recorded history, during the 90s. The industries spawned by integrated circuits are simply too numerous to count and would have been impossible to predict when that first transistor was made in the 50s. It is a failure of society that funnels such economic largesse towards hedge-fund managers and not towards solving the social ills Mr Jenkins rightly exposes.

Conflict of interest? Not really. Science is being cornered from all sides and if anyone’s going to defend its practice, it’s going to be scientists. But we’re often too ready to confuse participation with investment and, at the first hint of an allegation of conflict, we don’t wait to verify matters for ourselves.

I’m sure Yuri Milner’s investment of $100 million today to help the search for extra-terrestrial intelligence will be questioned, too, despite Stephen Hawking’s moving endorsement of it:

Somewhere in the cosmos, perhaps, intelligent life may be watching these lights of ours, aware of what they mean. Or do our lights wander a lifeless cosmos — unseen beacons, announcing that here, on one rock, the Universe discovered its existence. Either way, there is no bigger question. It’s time to commit to finding the answer – to search for life beyond Earth. We are alive. We are intelligent. We must know.

Pursuits like exploring the natural world around us are, I think, what we’re meant to do as humans, what we must do when we can, and what we must ultimately aspire to.

DoT backs net neutrality but wants end to free domestic Skype, WhatsApp calls

The Wire
July 17, 2015

It’s just good business. Credit: balleyne/Flickr, CC BY 2.0.

A Department of Telecommunications committee has released a report on the issue of net neutrality, following the controversial policy consultation paper that the Telecom Regulatory Authority of India put out in May. The report falls in line with many of the popular demands that surged on social media following the TRAI paper, and includes this telling line: “The Committee is of the view that the statement of [telecom companies] that they are under financial stress due to the rapidly falling voice revenues and insufficient growth in data revenues, is not borne out by evaluation of financial data.”

At the same time, it also tucks in a potentially controversial suggestion that could rekindle debate: of regulating domestic calls made through VoIP-enabled over-the-top (OTT) services like WhatsApp and Viber through the Telegraph Act, while leaving alone international calls made through the same apps. It remains to be seen how many of the report’s recommendations TRAI will adopt.

One of the more contentious topics in the TRAI paper was if OTT services like Facebook and WhatsApp, called so because they rely on local Internet service providers to relay data between their applications and users, should be regulated in India. The DoT report states that non-VoIP OTTs, as well as application-based services like Uber and Ola Cabs, won’t be regulated. VoIP stands for voice-over Internet Protocol, the use of an Internet connection to make phone calls.

Strangely, the report marks a distinction between domestic calls made through VoIP OTTs and international VoIP OTTs, and recommends that only the former be regulated. The ostensible reason for this is that the DoT wants to protect the revenues of telecom companies and, possibly, doesn’t want to interfere with the millions of middle-class Indians who keep in touch with their sons and daughters abroad. But no explicit reason for this differentiation has been provided. In fact, as Pranesh Prakash of the Centre for Internet and Society pointed out on Twitter, the DoT’s suggested use of licenses to regulate such VoIP OTTs isn’t a net-neutrality issue in the first place.

Beyond this sore point: the report also examines how – and how not to – examine data packets flowing through the ‘pipes’, or connections between nodes, of the Internet, and expressly rules out the illegal use of deep packet inspection. Deep packet inspection is a technique often used on networks to eavesdrop on data as it passes through a pipe. The document has also been courageous enough to admit that not all zero-rating plans “are controversial or against the net neutrality principles”. Zero-rating is akin to a toll-gate within a pipe which allows data of some forms or originating from certain sources to pass through without a fee while taxing the rest. Such implementations could be useful when providing government services – like railway bookings – for cheap to the rural poor, but at the same time would have to be protected from non-competitive uses by private enterprises.
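The toll-gate analogy maps onto a very simple billing rule – a toy sketch, with invented service names and tariffs, of how zero-rating discriminates by the source of traffic rather than its content:

```python
# Toy zero-rating: traffic is billed by its source, not by its content.
# Service names and the tariff are invented for illustration.

ZERO_RATED_SOURCES = {"railway-bookings.example.gov"}  # e.g. a government service
RATE_PER_MB = 0.25                                     # Rs per MB otherwise

def charge(source: str, megabytes: float) -> float:
    """Fee for relaying `megabytes` of traffic originating at `source`."""
    if source in ZERO_RATED_SOURCES:
        return 0.0  # passes the toll-gate free
    return megabytes * RATE_PER_MB

print(charge("railway-bookings.example.gov", 100.0))  # 0.0
print(charge("small-startup.example", 100.0))         # 25.0 - full tariff
```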

Thus, the report recommends “the incorporation of a clause in the license conditions of TSP/ISPs that will require the licensee to adhere to the principles and conditions of Net Neutrality specified by guidelines issued by the licensor from time to time”.

Beyond the questions surrounding net neutrality, the report also takes a stand on India’s digital sovereignty, taking cognisance of the fact that “there is a need for a balance to be drawn to retain the country’s ability to protect the privacy of its citizens and data protection without rendering it difficult for business operations”. It goes on to suggest that the TRAI could “identify critical and important areas through public consultations” when the question of hosting data locally – in servers as well as pipes physically located in the country – arises. Now, the ball is decidedly in TRAI’s court, and it would be unfair to say the body isn’t under pressure to implement what appears to be an amenable report from the DoT.

Of small steps and giant leaps of collective imagination

The Wire
July 16, 2015

Is the M5 star cluster really out there? Credit: HST/ESA/NASA

We may all harbour a gene that moves us to explore and find new realms of experience but the physical act of discovery has become far removed from the first principles of physics.

At 6.23 am on Wednesday, when a signal from the New Horizons probe near Pluto reached a giant antenna in Madrid, cheers went up around the world – with their epicentre at the Applied Physics Laboratory in Maryland, USA.

And the moment it received the signal, the antenna’s computer also relayed a message through the Internet that updated a webpage showing the world that New Horizons had phoned home. NASA TV was broadcasting a scene of celebration at the APL and Twitter was going berserk as usual. Subtract these instruments of communication and the memory of humankind’s rendezvous with Pluto on the morning of July 15 (IST) is delivered not by the bridge of logic but by a leap of faith.

In a memorable article in Nature in 2012, the physicist Daniel Sarewitz made an argument that highlighted the strength and importance of good science communication in building scientific knowledge. Sarewitz contended that it was impossible for anyone but trained theoretical physicists to understand what the Higgs boson really was, how the Higgs mechanism that underpins it worked, or how any of them had been discovered at the Large Hadron Collider earlier that year. The reason, he said, was that a large part of high-energy physics is entirely mathematical, devoid of any physical counterparts, and explores nature in states the human condition could never physically encounter.

As a result, without the full knowledge of the mathematics involved, any lay person’s conviction in the existence of the Higgs boson would be punctured here and there with gaps in knowledge – gaps the person will be continuously ignoring in favour of the faith placed in the integrity of thousands of scientists and engineers working at the LHC, and in the comprehensibility of science writing. In other words, most people on the planet won’t know the Higgs boson exists but they’ll believe it does.

Such modularisation of knowledge – into blocks of information we know exist and other blocks we believe exist – becomes more apparent the greater the interaction with sophisticated technology. And paradoxically, the more we are insulated from it, the easier it is to enjoy its findings.

Consider the example of the Hubble space telescope, rightly called one of the greatest astronomical implements to have ever been devised by humankind.

Its impressive suite of five instruments, highly polished mirrors and advanced housing all enable it to see the universe in visible-to-ultraviolet light in exquisite detail. Its opaque engineering is inaccessible to most but this gap in public knowledge has been compensated many times over by the richness of its observations. In a sense, we no longer concern ourselves with how the telescope works because we have drunk our fill with what it has seen of the universe for us – a vast, multihued space filled with the light of a trillion stars. What Hubble has seen makes us comfortable conflating belief and knowledge.

The farther our gaze strays from home, the more we will become reliant on technology that is beyond the average person’s intellect to comprehend, on rules of physics that are increasingly removed from first principles, on science communication that is able to devise cleverer abstractions. Whether we like it or not, our experience, and memory, of exploration is becoming more belief-ridden.

Like the Hubble, then, has New Horizons entered a phase of transience, too? Not yet. Its Long-Range Reconnaissance Imager has captured spectacular images of Pluto, but none yet quite so spectacular as to mask our reliance on non-human actors to obtain them. We know the probe exists because the method of broadcasting an electromagnetic signal is somewhat easily understood, but then again most of us only believe that the probe is functioning normally. And this will increasingly be the case with the smaller scales we want to explore and the larger distances we want to travel.

Space probes have always been sophisticated bits of equipment but with the Internet – especially when NASA TV, DSN Now and Twitter are the prioritised channels of worldwide information dissemination – there is a perpetual yet dissonant reminder of our reliance on technology, a reminder that the Voyager Moment of our times is a celebration of technological prowess rather than exploratory zeal.

Our moment was in fact a radio signal reaching Madrid, a barely romantic event. None of this is a lament but only a recognition of the growing discernibility of the gaps in our knowledge, of our isolation by chasms of entangled 1s and 0s from the greatest achievements of our times. To be sure, the ultimate beneficiary is science, but one that is increasingly built upon a body of evidence far too specialised to be treasured equally by all of us.

Instead of reaching the sky, Aakash ends up six feet below

The Wire
July 15, 2015

Once at the centre of the Indian government’s half-baked schemes to make classrooms tech-savvy, the Aakash project wound down quietly in March 2015, an RTI has revealed. The project envisaged lakhs of school and engineering students armed with a tablet each, sold at Rs.1,130 courtesy the government, from which they partake of their lessons, access digitised textbooks and visualise complicated diagrams. Lofty as these goals were, the project was backed by little public infrastructure and much less coordination, resulting in almost no traction despite being punctuated regularly with PR ops.

The project was conceived in 2011 by the UPA-2 government to parallel the One Laptop Per Child program, forgetting conveniently that the latter worked only in small Uruguay and for unique reasons. Anyway, a British-Canadian company named DataWind was contracted to manufacture the tablets, which the government would then purchase for Rs.2,263 and subsidise so as to retail them at Rs.1,130.

However, the second version, whose development was led by IIT-Bombay and the Centre for Development of Advanced Computing and which was released in November 2012, bordered on the gimmicky. It had 512 MB RAM, a 7” screen, a 1 GHz processor and, worst of all, a battery that lasted all of three hours even as a full day at school typically spanned seven. Even so, the government announced that 50 lakh such tablets would be manufactured and that 1 lakh teachers would be trained to use them. In March 2013, then Union HRD Minister Pallam Raju called it the government’s “dream project”.

But what really crippled the program was not the operational delays or logistical failures but the Central government’s lackadaisical assumption that placing a tablet in a student’s hands would solve everything. For example, it was advertised that Aakash would be a load off children’s backs, eliminating the need to lug around boatloads of books. However, the NCERT didn’t bother to explain which textbooks would be digitised first – or at all – and when they’d be available. Similarly, the low-income households whose younger occupants the tablets targeted didn’t have access to regular electricity let alone an Internet connection. What the tablets would ultimately do was become, for those who couldn’t afford to maintain and use them, a burden.

The Aakash train on the other hand was on rails of its own. By November 2011, DataWind had shipped 6,440 devices but only 650 were found good enough to sell. Nonetheless, in January 2013, IIT-Bombay announced it was starting work on Aakash 3, and by July the same year had skipped to working on the fourth iteration. Then, in September 2013, the CAG alleged that IIT-Rajasthan, which had handled the Aakash project in 2011, had been awarded the project arbitrarily, received Rs.47.42 crore without any prior feasibility checks, and overran its budget by Rs.1.05 crore. However, this did nothing to slow things down.

The biggest beneficiary was DataWind, the air in its bellows blown by the Central government’s fantasy of arming itself with the same cargo that Western institutions sported. Between December 2013 and July 2014, the company was able to announce three new models in the Rs.4,000-7,000 price range, introduce one for the UK priced at ₤30, raise Rs.168 crore in an IPO, list on the Toronto Stock Exchange and get on the MIT Tech Review’s list of the 50 smartest tech companies of 2014 for breaking “the price barrier”.

The RTI application that revealed Aakash had been wound down also received the reply that the project had achieved all its objectives: procuring one lakh devices, testing them and establishing 300 centres in engineering colleges – to say nothing of the more ostentatious goal of linking 58.6 lakh students across 25,000 colleges and 400 universities through an ‘e-learning’ program. The reply also stated that specifications for a future device had been submitted to the MHRD. Whether the project will be revived by the ruling BJP government later is unknown.

And on that forgettable note of uncertainty, one of the more misguided digital-India schemes comes to a close.

Yoichiro Nambu, the silent revolutionary of particle physics, is dead

The Wire
July 18, 2015

Particle physics is an obscure subject for most people but everyone sat up and took notice when the Large Hadron Collider discovered the particle named after Peter Higgs in 2012. The Higgs boson propelled his name to the front pages of newspapers that until then hadn’t bothered about the differences between bosons and fermions. More importantly, it validated a hypothesis he and his peers had advanced 50 years earlier and helped the LHC’s collaborations revitalise their outreach campaigns.

However, much before the times of giant particle colliders – in the late 1950s, in fact – a cascade of theories was being developed by physicists the world over with much less fanfare, and a lot more of the quiet dignity that advanced theoretical physics is comfortable revelling in. It was a silent revolution, and led in part by the mild-mannered Yoichiro Nambu, who passed away on July 5, 2015.

His work and its derivatives gave rise to the large colliders, like the LHC, at work today, and might well have laid the foundations of modern particle physics research. Moreover, many of his and his peers’ accomplishments are not easily discussed the way political movements are, nor do they aspire to such privileges, but that didn’t make them any less important than the work of Higgs and others.

Yoichiro Nambu also belonged to a generation that marked a resurgence in Japanese physics research – consider his peers: Yoshio Nishina, Masatoshi Koshiba, Hideki Yukawa, Sin-Itiro Tomonaga, Leo Esaki, Makoto Kobayashi and Toshihide Maskawa, to name a few. A part of the reason was a shift in Japan’s dominant political attitudes after the Second World War. Anyway, the first of Nambu’s biggest contributions to particle physics came in 1960, and it was a triumph of intuition.

There was a span of 46 years between the discovery of superconductivity (by Heike Kamerlingh Onnes in 1911) and the birth of a consistent theoretical explanation for it (by John Bardeen, Leon Cooper and John Schrieffer in 1957) because the phenomenon seemed to defy some of the first principles of the physics used to understand charged particles. Nambu was inspired by the BCS theory to attempt a solution for the hierarchy problem – which asks why gravity, the weakest of the four fundamental forces, is some 10^32 times weaker than the strong nuclear force, the strongest of them.

With the help of British physicist Jeffrey Goldstone, Nambu theorised that whenever a natural symmetry breaks, massless particles called Nambu-Goldstone bosons are born under certain conditions. The early universe, around 13.75 billion years ago when it was extremely small, consisted of a uniform pond of unperturbed energy. Then, the pond was almost instantaneously heated to a temperature of 173 billion Suns, when it broke into smaller packets called particles. The symmetry was (thought to be) spontaneously broken and the event was called the Big Bang.

Then, as the universe started to cool, these packets couldn’t reunify into the pond they had once made up, evolving instead into distinct particles. There were perturbations among the particles, and the resultant forces were mediated by what came to be called Nambu-Goldstone bosons, named for the physicists who first predicted their existence.
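
For readers who want the textbook version of this idea, here is a minimal sketch – my illustration, not the article’s or Nambu’s own formulation – of how a spontaneously broken symmetry yields a massless particle:

```latex
% A textbook sketch (my illustration, not from the article) of spontaneous
% symmetry breaking producing a massless mode. Consider a complex scalar
% field $\phi$ with the U(1)-symmetric ``Mexican hat'' potential
V(\phi) = \mu^2\,\phi^{*}\phi + \lambda\,(\phi^{*}\phi)^2,
\qquad \mu^2 < 0 .
% Its minimum lies not at $\phi = 0$ but on a circle of radius $v/\sqrt{2}$,
% where $v = \sqrt{-\mu^2/\lambda}$. Expanding about any point on that circle,
\phi(x) = \frac{1}{\sqrt{2}}\,\bigl(v + h(x)\bigr)\,e^{\,i\theta(x)/v},
% the radial mode $h$ acquires a mass ($m_h^2 = 2\lambda v^2$), while the
% angular mode $\theta$ only slides $\phi$ along the circle of minima and
% so has no mass term: $\theta$ is the massless Nambu-Goldstone boson.
```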

Yoichiro Nambu in 2008. Source: University of Chicago

Nambu was able to use the hypothetical interactions between the Nambu-Goldstone bosons and particles to explain how the electromagnetic force and the weak nuclear force (responsible for radioactivity) could be unified into one electroweak force at higher temperatures, as well as where the masses of protons and neutrons come from. These were (and are) groundbreaking ideas that helped scientists make sense of the intricate gears that turned then to make the universe what it is today.

Then, in 1964, six physicists (Higgs, Francois Englert, Tom Kibble, Gerald Guralnik, C.R. Hagen, Robert Brout) postulated that these bosons interacted with an omnipresent field of energy – called the Higgs field – to give rise to the strong-nuclear, weak-nuclear and electromagnetic forces, and the Higgs boson. And when this boson was discovered in 2012, it validated the Six’s work from 1964.

However, Nambu’s ideas – as well as those of the Six – also served to highlight how the gravitational force couldn’t be unified with the other three fundamental forces. In the 1960s, Nambu’s first attempts at laying out a framework of mathematical equations to unify gravity and the other forces gave rise to the beginnings of string theory. But in the overall history of investigations into particle physics, Nambu’s work – rather, his intellect – was a keystone. Without it, the day theorists’ latinate squiggles on paper could’ve become prize-fetching particles in colliders would’ve been farther off, the day we made sense of reality farther off, the day we better understood our place in the universe farther off.

Osaka City University, where Nambu was a professor, announced his death on July 17; the cause was an acute myocardial infarction. He is survived by his wife Chieko Hida and son John. Though he was an associate professor at Osaka from 1950 to 1956, he visited the Institute for Advanced Study at Princeton in 1952 to work with Robert Oppenheimer (and meet Albert Einstein). In 1954, he became a research associate at the University of Chicago, and finally a professor there in 1958. He became an American citizen in 1970.

Peter Freund, his colleague in Chicago, described Nambu as a person of incredible serenity in his 2007 book A Passion for Discovery. Through the work and actions of the biggest physicists of the mid-20th century, the book fleshes out the culture of physics research and how it was shaped by communism and fascism. Freund himself emigrated from Romania to the US in the 1960s to escape the dictatorial madness of Ceausescu, a narrative arc that is partially reflected in Nambu’s life. After receiving his bachelor’s degree from the University of Tokyo in 1942, Nambu was drafted into the army, witnessed the infamous firebombing of Tokyo, and was in Japan when Hiroshima and Nagasaki were bombed.

The destructive violence of the war that Nambu studied through is mirrored in the creative energies of the high-energy universe whose mysteries Nambu and his peers worked to decrypt. It may have been a heck of a life to live through but the man himself had only a “fatalistic calm”, as Freund wrote, to show for it. Was he humbled by his own discoveries? Perhaps, but what we do know is that he wanted to continue doing what he did until the day he died.

What you need to know about the Pluto flyby

The Wire
July 14, 2015

In under seven hours, the NASA New Horizons space probe will fly by Pluto at 49,900 km per hour, from a distance of 12,500 km. It’s what the probe set out to do when it was launched in January 2006. The flyby will allow it to capture high-resolution images of the dwarf planet’s surface and atmosphere as well as take a look at its biggest moon, Charon. For much of the rest of the day, it will not be communicating with mission control as it conducts observations. The probe’s Long-Range Reconnaissance Imager (LORRI) has already been sending better and better pictures of Pluto as it gets closer. During closest approach, Pluto will occupy the entire field of view of LORRI to reveal the surface in glorious detail.

Fourteen minutes into the Pluto flyby, New Horizons will make its closest approach to Charon, about 24,000 km away. Next: 47 minutes and 28 seconds after the Charon flyby, the probe will find itself in Pluto’s shadow, where its high-gain antenna will observe how the dwarf planet’s atmosphere affects sunlight and radio signals from Earth as they pass through it. Then, 1 minute and 2 seconds after that, New Horizons will again be in sunlight. Finally, 1 hour and 25 minutes later, it will be in Charon’s shadow to look for its atmosphere.
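
Strung together, the intervals above make a compact timeline. The sketch below is mine, not NASA’s – it only chains the offsets quoted in the preceding paragraph, measured from the moment of closest approach to Pluto:

```python
# Chaining the quoted intervals into a single timeline (approximate;
# the individual offsets are from the text, the chaining is mine).
from datetime import timedelta

events = [("Closest approach to Pluto", timedelta(0))]
events.append(("Closest approach to Charon",
               events[-1][1] + timedelta(minutes=14)))
events.append(("Enters Pluto's shadow",
               events[-1][1] + timedelta(minutes=47, seconds=28)))
events.append(("Back in sunlight",
               events[-1][1] + timedelta(minutes=1, seconds=2)))
events.append(("Enters Charon's shadow",
               events[-1][1] + timedelta(hours=1, minutes=25)))

for name, offset in events:
    print(f"T+{str(offset):>8}  {name}")
```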

That New Horizons survived the flyby will be known when, early on Wednesday morning (IST), it starts to send communication signals Earthward again. The timings of various events announced by NASA will have to be adjusted against the fact that New Horizons is 4.5 light-hours from Earth. NASA has called for a press conference to release the first close-up images at 0030 hrs on July 16 (IST). All the data snapped by the probe during the flyby will be downloaded over a longer period of time. According to Emily Lakdawalla,

Following closest approach, on Wednesday and Thursday, July 15 and 16, there will be a series of “First Look” downlinks containing a sampling of key science data. Another batch of data will arrive in the “Early High Priority” downlinks over the subsequent weekend, July 17-20. Then there will be a hiatus of 8 weeks before New Horizons turns to systematically downlinking all its data. Almost all image data returned during the week around closest approach will be lossily compressed — they will show JPEG compression artifacts. Only the optical navigation images are losslessly compressed. [All dates/times in EDT]

Downloading the entire science dataset including losslessly compressed observations will take until around November 2016 to complete. Until then, the best will always be yet to come. As always, all communications will be via the Deep Space Network – whose Goldstone base is currently all ears for the probe.
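
Incidentally, that 4.5-light-hour separation is easy to unpack with a line of arithmetic – an illustration of mine, using nothing beyond the quoted delay and the speed of light:

```python
# Illustrative arithmetic (not from the article): at 4.5 light-hours,
# how far away New Horizons is, and how long a round-trip signal takes.
LIGHT_SPEED_KM_S = 299_792.458   # speed of light, km/s

delay_hours = 4.5                # one-way light travel time quoted above
distance_km = LIGHT_SPEED_KM_S * delay_hours * 3600

print(f"One-way distance: {distance_km / 1e9:.2f} billion km")  # ~4.86 billion km
print(f"Round trip for a signal: {2 * delay_hours:.0f} hours")  # ~9 hours
```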

DSN Now. Source: Screengrab

Incidentally, the ashes of the astronomer Clyde Tombaugh, who discovered Pluto in 1930, are onboard New Horizons.

What do we know about Pluto?

The last images taken by LORRI before the flyby revealed a strange geology on Pluto. Scientists noted dark and bright polygonal patches (in the shapes of a whale and a heart, respectively) as well as what appeared to be ridges, cliffs and several impact craters. However, these features are on the side of Pluto facing New Horizons as it flies in. During the flyby, it will image the other side of Pluto, where these features may not be present. The probe can’t hang around to wait for the other side to swing into view either, because Pluto rotates once every 6.4 Earth-days.

An annotated image of Pluto snapped by the New Horizons probe. Credit: Applied Physics Lab/NASA

During the flyby, images of Charon will also be taken. The probe has already revealed that the moon, like Pluto, has several intriguing features – giant craters and chasms among them – even though until recently both bodies were thought to be frozen, featureless balls of ice and rock. In fact, NASA noted one crater near Charon’s south pole almost 100 km wide, and another on Pluto some 97 km wide, both appearing to be the results of recent impacts (within the last billion years). Two theories could explain the particularly dark appearance of the Charon crater: either the ice at its bottom is of a different, less reflective kind than the usual, or the ice melted during impact and then refroze into larger, less bright grains.

An annotated image of Charon snapped by the New Horizons probe. Credit: Applied Physics Lab/NASA

All these features will be examined in detail during New Horizons’ flyby. They will reveal how the two bodies evolved in the past, the structure and composition of their interiors, and whether – as some astronomers suspect – Charon harboured a subsurface ocean in its past. Complementarily, NASA will also be training the eyes of its Cassini, Spitzer and Kepler space-borne instruments on Pluto. Cassini, from its orbit around Saturn, will take a picture of New Horizons just around the time of its flyby. From July 23 to July 30, the Spitzer Space Telescope will study Pluto in the infrared, mapping its surface ice. Then, in October, the exoplanet-hunting Kepler telescope, in its second avatar as K2, will start focusing on the changes in brightness of and around Pluto to deduce the body’s orbital characteristics.

Then, there are also post-flyby missions whose results, when pieced together with the July 14 flyby and other observations, will expand our knowledge of Pluto in its larger environment: the Kuiper Belt, at whose inner edge it resides.

Finally, as Dennis Overbye of The New York Times argued in a poignant essay, the Pluto flyby marks the last of the Solar System’s classical planets to be explored, the last of the planets the people of our generation will get to see up close. The next frontiers in planetary exploration will be the exoplanets – the closest of which is 4.3 light-years away (orbiting Alpha Centauri B). But until then, be willing to consider the Solar System’s moons, missions to which are less than a decade away. Leaving you with Overbye’s words:

Beyond the hills are always more hills, and beyond the worlds are more worlds. So New Horizons will go on, if all goes well, to pass by one or more of the cosmic icebergs of the Kuiper belt, where leftovers from the dawn of the solar system have been preserved in a deep freeze extending five billion miles from the sun…

But the inventory of major planets — whether you count Pluto as one of those or not — is about to be done. None of us alive today will see a new planet up close for the first time again. In some sense, this is, as Alan Stern, the leader of the New Horizons mission, says, “the last picture show.”

Physicists find exotic particle with five quarks

The Wire
July 14, 2015

An artist’s impression of five strongly bonded quarks in a pentaquark. Credit: CERN/LHCb Collaboration

“When you have eliminated the impossible, whatever remains, however improbable, must be the truth” – thus spake Sherlock Holmes. Particle physicists at the Large Hadron Collider today announced the discovery of a new particle after an investigation that followed in the steps of Holmes’ wisdom. The particle, called a pentaquark, is exceedingly rare in the books of fundamental physics. It’s named for the fact that it’s composed of five quarks, the indivisible particles that in various combinations make up all known matter.

Quarks commonly manifest as protons and neutrons, which are clumps of three quarks each. This, however, is the first time experimental physicists have observed five of them coming together to make a bigger particle.

The collaboration of scientists and engineers running the LHCb detector – which spotted the pentaquarks – uploaded a paper to the arXiv preprint server on July 13 and submitted a copy to the journal Physical Review Letters for publication. The abstract describes two resonances – i.e. unstable particles – at masses of 4,380 MeV and 4,449.8 MeV (to compare, a proton weighs 938 MeV), not including uncertainties in the range of 40-110 MeV. They have been temporarily designated Pc(4380)+ and Pc(4450)+.
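
To put those masses in more familiar units – a comparison of my own, not the paper’s – each resonance weighs in at several proton masses:

```python
# A quick comparison (mine, not the paper's): the two resonance masses
# in units of the proton mass, using the figures quoted above.
PROTON_MASS_MEV = 938.0

for name, mass_mev in [("Pc(4380)+", 4380.0), ("Pc(4450)+", 4449.8)]:
    print(f"{name}: {mass_mev / PROTON_MASS_MEV:.2f} proton masses")  # ~4.7 each
```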

The LHCb detector spotted the pentaquarks during the particle decays of another particle called Λb (read Lambda b). However, instead of discerning their presence by a spike in the data, the scientists spotted them by accounting for all other data points and then figuring out one consistent explanation for what was leftover. And the explanation called for conceding that the scientists had finally spotted the elusive pentaquark. “Benefitting from the large data set provided by the LHC, and the excellent precision of our detector, we have examined all possibilities for these signals, and conclude that they can only be explained by pentaquark states”, said LHCb physicist Tomasz Skwarnicki of Syracuse University in a statement.

According to the pre-print paper, the chances of the observation being a fluke, or due to some other process that could’ve mimicked the production of pentaquarks, are less than 1 in 3.5 million. As a result, the observations are sufficiently reliable and make for a discovery – even if the particle wasn’t observed as much as its unique shadow was. At the same time, because the history of the experimental pursuit of pentaquarks is dotted with shepherds crying wolf, the data will be subjected to further scrutiny. In the most recent and famous case, in 2003, four research labs from around the world (TJNAF, ITEP, SPring-8, ELSA) claimed to have spotted pentaquarks, only to be disproved by tests at the Istituto Nazionale di Fisica Nucleare in Genova in April 2005.
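
Those odds of 1 in 3.5 million are the particle physicist’s conventional five-sigma discovery threshold. A minimal sketch of the conversion – my illustration, assuming the one-sided Gaussian convention the field uses, not a computation from the paper:

```python
# Converting the quoted fluke odds into the "sigma" language physicists
# use: odds of 1 in 3.5 million correspond to the conventional one-sided
# five-sigma discovery threshold.
from scipy.stats import norm

p_value = 1 / 3.5e6
sigma = norm.isf(p_value)  # inverse survival function of the standard normal
print(f"p = {p_value:.2e}  ->  {sigma:.2f} sigma")  # ~5.0 sigma
```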

The LHC, which produces the high-energy collisions that detectors like the LHCb study in detail, shut down in early 2013 for a series of upgrades and reawakened in May 2015. The pentaquark was found in data gathered during the first run, when the LHC produced collisions at an energy of 8 TeV (1 TeV is a million MeV). In the second run, the collision energy has been hiked to 13 TeV, which increases the frequency with which exotic particles like pentaquarks could be produced.