Science

  • ‘Hunters’, sci-fi and pseudoscience

    One of the ways pseudoscience becomes connected to authoritarian governments is by taking on a new purpose: to supply an alternate intellectual tradition that subsumes science and culminates in the identitarian superiority of a race, culture or ethnic group. In return, the regime empowers aspects of the tradition, both to legitimise it and to catalyse its adoption by the proverbial masses, tying faith in its precepts to a sense of agency and, of course, giving the regime divine sanction to rule.

    Readers of this blog will recognise that the spiritual features of Hindutva the Bharatiya Janata Party regularly draws on fit the bill. A German rocket scientist named Willy Ley, who emigrated to the US before World War II, published an essay entitled ‘Pseudoscience in Naziland’ in 1947, in which he describes the sort of crazy beliefs that, together with other conditions, prepared the ground for the advent of Nazism.

    In Hunters, the Amazon Prime show about Jewish Nazi-hunters in 1970s America, Edward Bulwer-Lytton’s sci-fi novel The Coming Race (1871) finds brief mention as a guiding text for neo-Nazis. In the novel, a subterranean race of angelic humanoids has acquired great power and superhuman abilities by manipulating a magical substance called Vril, and threatens to rise to the surface and destroy the human race one day.

    Bulwer-Lytton also wrote that Vril alludes to electricity (i.e. the flow of electrons) and that The Coming Race is an allegory about how an older generation of people finds itself culturally and politically incompatible with a new world order powered by electric power. (At the same time, he believed these forces were a subset of the aether, so to speak.) In a letter to John Forster dated March 20, 1870 – 150 years ago, twelve days from now – Bulwer-Lytton wrote:

    I did not mean Vril for mesmerism, but for electricity, developed into uses as yet only dimly guessed, and including whatever there may be genuine in mesmerism, which I hold to be a mere branch current of the one great fluid pervading all nature. I am by no means, however, wedded to Vril, if you can suggest anything else to carry out this meaning – namely, that the coming race, though akin to us, has nevertheless acquired by hereditary transmission, etc., certain distinctions which make it a different species, and contains powers which we could not attain through a slow growth of time; so that this race would not amalgamate with, but destroy us.

    And yet this race, being in many respects better and milder than we are, ought not to be represented terrible, except through the impossibility of our tolerating them or they tolerating us, and they possess some powers of destruction denied to ourselves.

    The collection of letters is available here.

    In Bulwer-Lytton’s conception, higher technological prowess was born of hereditary traits. In a previous letter, dated March 15, Bulwer-Lytton had written to Forster:

    The [manuscript] does not press for publication, so you can keep it during your excursion and think over it among the other moonstricken productions which may have more professional demand on your attention. The only important point is to keep in view the Darwinian proposition that a coming race is destined to supplant our races, that such a race would be very gradually formed, and be indeed a new species developing itself out of our old one, that this process would be invisible to our eyes, and therefore in some region unknown to us.

    So this is not a simple confusion or innocent ignorance. Bulwer-Lytton’s attribution of the invention of electricity to genetic ability was later appropriated by interwar German nationalists.

    This said, I’m not sure how much I can read into the reimagination of technological ability as a consequence of evolution or racial superiority because another part of Bulwer-Lytton’s letters suggests his example of electricity was incidental: “… in the course of the development [of the new species], the coming race will have acquired some peculiarities so distinct from our ways … and certain destructive powers which our science could not enable us to attain to, or cope with. Therefore, the idea of electrical power occurred to me, but some other might occur to you.”

    Now, according to Ley, the Society for Truth believed Vril to be a real thing and used its existence to explain how the Britons created their empire. I don’t know how much stock Adolf Hitler and his “shites of the round table” (to quote from Hunters) placed in this idea but the parallels must have been inescapable – especially so since Ley also writes that not just any pseudoscientific belief could have supported Hitler’s rise or acquired his patronage. Instead, the beliefs had to be culturally specific to Germany, pandering to local folklore and provincialism.

    Without commenting on whether this conclusion would apply to Fascism 2.0 in a world with the internet, civil aviation and computerised banking, and in naïve spite of history’s fondness for repeating itself and the politico-corporate-media complex, I wonder what lessons there are here – if any – for science educators, a people already caught between political anti-intellectualism and a stronger sense of their purpose in an intellectually debilitated society.

  • A great discussion on the history of India’s tech.

    On February 27, the Bangalore International Centre and Carnegie India hosted a panel discussion around Midnight Machines, the new book by Arun Mohan Sukumar that traces the interplay of technology and politics in independent India (read The Wire Science‘s review here). The panelists were Arun (my friend and former colleague at The Hindu), space entrepreneur Susmita Mohanty, Rajya Sabha MP Rajeev Gowda, historian of science Jahnavi Phalkey, and Anu Singh of Carnegie India.

    The whole discussion was about 90 minutes long, and picked up steam after the first 15 minutes or so. If you’re at all interested in the history of science and technology in India, I recommend you watch the whole thing on YouTube. If not, I’d like to draw your attention to a few passages of discussion I found interesting, which I’ve also transcribed below. In the parts where Arun and Phalkey directly debated each other, Arun emerged with only minor bruises – which, I shouldn’t have to tell you, is a considerable feat, and may not have been the case in a full-on, two-person debate!

    Jahnavi Phalkey, 32:00 – The political ambition of a state is now technological ambition. That’s why the technological story of the latter half of the 20th century is a political one, and is therefore also political in India. The other aspect of this is centralisation. While we in India have argued that the Indian state centralised research funding through the CSIR, DAE, the space programme, etc. with all money going into a few facilities, look at Europe. The European answer was CERN, with countries coming together to build facilities. Apart from the US, there was no economy then that could conduct scientific research at the scale for which the tone was set during the Second World War.

    Therefore, the centralisation solution adopted (also) in India was no different from what was happening globally. So what was happening in India was not anomalous. It’s a part of the larger story. To add a footnote to the Nehru story: Nehru spoke science, he said “scientific temper”, but look at the institutions he established: the IITs (it would be another 60 years before India set up the IISERs) and the CSIR (he didn’t go for the Max Planck Institutes model, the Kaiser Wilhelm Institutes model or the Harnack principle but focused on industrial research); the IISc came 50 years before independence. So the accusation that Nehru spoke science, did science but didn’t do technology does not hold up.

    [At one point, Arun also talks about how India needed a Nehru to navigate the Non-Aligned Movement and still secure favours from different governments, without upsetting the precarious balance of powers (so to speak), to help set up some of India’s preeminent IITs. I skimmed through the video twice but couldn’t find the exact timestamp.]

    Arun Mohan Sukumar, 43:50 – A CSIR scientist said the failure of the solar cooker project basically ensured that all the scientists [who worked on it] retreated into the comfort of their labs and put them off “applied science”.

    Here’s a project commenced almost immediately after independence meant to create technology by Indians for Indians, and after it failed for various reasons, the political spotlight that had been put on the project was counterproductive. Nehru himself investing this kind of capital exposed him and the scientific establishment to criticism that they were perhaps not used to. These were venerated men in their respective fields and they were perhaps unused to being accountable in this visceral way. India offered a prototype of the solar cooker to Egypt and, I believe, Rhodesia or South Africa, and the joke goes that the order was never repeated. D.D. Kosambi says in an opinion piece at the time that the only person who made any profit out of the solar cooker affair is the contractor who sold it for scraps.

    This is the kind of criticism confronted by the scientific establishment and it is a consequence of politics. I agree with Prof Phalkey when she says it was a consequence of the political establishment not insulating the scientific establishment from the sort of criticism which may or may not be informed but you know how the press is. That led to a gradual breaking of ranks between the CSIR and the political vision for India where you’d have these mass technologies that [Phalkey] mentioned, and you can see the best evidence for that is Nehru’s pursuit of massive industrialisation in the second Five Year Plan, from 1956 to 1961.

    This isn’t to say that Nehru was surrounded by advisers who all believed in the same thing; there was of course [P.C.] Mahalanobis who believed in a more aggressive form of industrialisation. But at various points of time one constituency was trumping another, within even the establishment. But it needs to be said that the PM was not in favour of introducing tractors in agriculture… Again, this is all criticism with the wisdom of hindsight.

    Jahnavi Phalkey, 53:16 – In the 1970s, look at the number of democratic regimes that fell due to hot wars fought during the Cold War in the rest of the world. You’ll start to see why the need for control was felt.

    Arun Mohan Sukumar [following after Rajeev Gowda’s comments], 55:05 – Another dimension is the presence of universities in the US, which incubated the military-industrial complex. Harvard and MIT in Boston and Stanford in the Silicon Valley were the nuclei for research. In India, some of these are truly unfortunate circumstances that the government has no control over. When the first batch of graduates passed out of IIT Kanpur in 1965, Lyndon B. Johnson passed the Immigration and Nationality Act giving Indians, and people of other nationalities, an automatic path to citizenship. So the best minds of our country were prompted by the fact that there aren’t enough jobs or enough well-paying jobs in India [to enter] a feeder line created between India and the US, from which it is very difficult to come back. Those circumstances too must be acknowledged.

    Susmita Mohanty, 56:20 – Even brain drain is hugely exaggerated. I’ve lived in four different countries. The talent pool we have in India today is as big or bigger. There are people leaving the country but not everyone is the best coder in town.

    Arun Mohan Sukumar, 57:24 – The appropriate technology movement that started in the late 1960s and early 1970s was this philosophy that grew out of Western Europe and the US which called for lesser consumption of natural resources and labour-intensive jobs with a view to conserving resources for the planet, a lot of which was precipitated by a report called ‘Limits to Growth’, which essentially predicted this catastrophe that would befall humanity by 2000.

    And then economist [E.F.] Schumacher writes this book called ‘Small is Beautiful’ [in 1973] and creates a revolution incidentally not just in advanced societies but also in developing countries, where leaders like Indira Gandhi coopted the movement to say to the people that you should consume less, conserve your natural resources and deploy labour-intensive technologies that will essentially be beneficial to you and your way of life. Seminar after seminar was organised by top institutes of the time to talk about how you can create fuel out of biogas, how you can mechanise bullock carts – technologies that are not scalable but nevertheless are quick-fixes, and this is where ‘jugaad’ has its historical origin: in the valorisation of frugal innovation.

    [Phalkey shakes her head in disagreement.]

    This would’ve been acceptable had it not been for the fact that investments in the space and nuclear programmes continued unabated. … So on the one hand the state was promoting big science and it wasn’t as if they had an ideological or political compulsion against Big Machine and big technologies. There were just factors such as financial considerations and the government’s own inability to develop technology at home which, I argue, led Indira Gandhi to co-opt the appropriate technology movement. … In India, perhaps it’s harsh to say that we moved backwards, but the objective was not to redefine technology but to shun it altogether. [Phalkey is quite in disagreement at this point.] That unfortunately is I feel a byproduct of the legacy of the 1970s.

    Jahnavi Phalkey, 1:01:14 – I have to disagree because there’s been only one science plan in the country in its history, and that was done in the 1970s under Mrs Gandhi’s regime. Eighteen-hundred people from user ministries, the Planning Commission, scientific institutions and industry sat together over 18-24 months and came up with a comprehensive plan as to how to take research happening in the institutions and in the CSIR through Planning Commission allocation of money to the user ministries. We haven’t seen anything on this scale before or since.

    Problem was as soon as Mrs Gandhi implemented the plan, she also implemented the Emergency. When the Emergency was pulled back, the Morarji Desai regime decided that India did not need [the science plan]. So the argument you’re making [addressing Arun] of scaling back on technology or technology as a solution to the social, political and other problems that India had was more due to the Janata regime and not Mrs Gandhi’s. One needs to make this small distinction because this was simply not true at the time.

    Arun Mohan Sukumar, 1:06:09 – What was remarkable to me while writing this book was this factoid that comes from this book on the history of computing in India by C.R. Subramanian: he says the import of computers to India tripled during the years of the Emergency. For the life of me, I can’t imagine why! But it goes to show that despite the anti-automation protests of the 1960s and 1970s, and remember that 1978 is the year when IBM quit India for whatever reasons, there was beginning to be this gradual embrace of technology, which really takes off from the 1980s. And from the moment of liberalisation in 1991, it’s a different story altogether.

    Some of these legacies continue to haunt us, whether it is popular protests against nuclear plants, which really came of age in the 1960s and 1970s, not just in India but also in other parts of the world. Some of that really bore on India as well, and I believe continued into the debate on genetically modified crops. If you ask a person who really has a strong opinion on these subjects, I wonder whether he or she would have a clear idea of what the technology is. But they evoke such strong views, and perhaps some of it is due to the constant politicisation of the virtues and vices of the technology.

    Arun Mohan Sukumar, 1:09:04 – One of the reasons why the Indian opposition to the Human Genome Project was so pronounced in the early 1990s, when the hand of invitation was extended to the Indian government, was because the Vaccine Action programme signed by Rajiv Gandhi and Ronald Reagan just a few years earlier ran into a great deal of controversy within and without government; defence ministry officials said here is an effort to take DNA materials from Indians to be turned against India as an agent of biological warfare, and all sorts of rubbish.

    [How history repeats itself!]

    Adding to this, some private institutes in the US were involved in smuggling anti-rabies vaccines into developing countries. All of this spooked the scientific establishment, and this, the book argues, led to us staying away from the Human Genome Project.

    … And we missed the bus. Today we say we are able to map the genome of some man from Jharkhand at a fraction of the cost – it is at a fraction of the cost because most of the work has already been done. There is some historical legacy there that unfortunately continues to haunt us.

    [Susmita Mohanty mentions ISRO’s famous reluctance to share information about components of its civilian space programme.]

    Jahnavi Phalkey, 1:12:26 – There’s also a little bit of politics to it. The information that NASA and ESA share is backed by a very, very, very strong politics of sharing. What can and cannot be shared are clearly divided.

    Jahnavi Phalkey, 1:13:57 – If you begin with Robert Clive, we have a history of about 300 years of building suspicion. And to dismantle that kind of suspicion is going to take lots of work. I’m not saying we should not have participated in the Human Genome Project, or that it’s not a good thing to share, or that we shouldn’t embark on certain projects. I think we might be erring on the side of caution.

    Arun Mohan Sukumar, 1:17:58 – There are different kinds of technocracies, and the three people surveyed in the book [who represented those kinds] are M. Visvesvaraya, Vikram Sarabhai and Nandan Nilekani. They forged three different organisational structures within government (of course Visvesvaraya did so before independence), and they had different views of technology. I wouldn’t say they were all political animals but they certainly had a good appreciation of politics which was crucial to their success.

    For example, Visvesvaraya was a very astute navigator of colonial-era politics but then resigned as the diwan of Mysore over what he perceived as anti-Brahmin protests in the Madras presidency and the threat of that spilling over into Mysore. Finally, after independence, his views were totally marginalised by the establishment of the time.

    Sarabhai was in currency throughout but also in many respects was able to tell the leadership what it wanted to hear and at the same time insulate his own team from politics to the extent that ISRO today has a separate recruitment process. Some degree of autonomy was built-in.

    Nilekani’s work on Aadhaar goes the exact opposite way: he is very clear that he does not want scientists or technologists running the programme beyond its infancy… He was very sure at the beginning that an IAS officer should be running UIDAI. We can debate the merits of the decision but the fact is, in his view and the view of the team, the technocracy could only survive if it was built from within government. Whereas when Sarabhai died, Satish Dhawan was brought from Caltech to run ISRO. It was very clear for the folks behind Aadhaar that that model would not have survived.

    fin.

    Featured image: The panelists (L-R): Arun Mohan Sukumar, Susmita Mohanty, Rajeev Gowda, Jahnavi Phalkey and Anu Singh.

  • The Nobel intent

    You’ve probably tired of this but I can’t. The Nobel Prize folks just sent out a newsletter ahead of Women’s Day, on March 8, describing the achievements of female laureates of each of the six prizes. This is a customary exercise we’ve come to expect from organisations and companies trying to make themselves look good on the back of an occasion presumably designed to surmount the sort of barriers to women’s empowerment and professional success the organisations and companies often themselves perpetuate. For example, this Nobel Prize newsletter shows off with some truly ironic language. Consider the notes accompanying the science prize winners:

    “I remember being told over and over again: Women, you can do anything, so it never entered my mind that I couldn’t.” Donna Strickland was awarded the Nobel Prize in Physics 2018 for her work with laser pulses.

    She was also the first female laureate of the physics prize in 55 years.

    [Marie Curie] is the first Nobel Prize awarded woman and the only one to have received it in Physics as well as Chemistry.

    … because the prize committee chose not to award anyone else.

    Gerty Cori was the first woman to receive a Medicine Prize.

    … because the prize committee chose not to award anyone else.

    This is also what baffles me, especially in October and December every year, when the awards are announced and conferred, respectively: why do people take seriously an award that doesn’t take their issues seriously? Any other institution that did the same thing, and self-aggrandised as often as it could, would’ve been derided, turned into memes even. But every year we – millions of Indians at least, scientists and non-scientists – look to the Nobel Prizes to acknowledge Indian contributions to science, missing entirely the point that the prizes are a deeply flawed human enterprise, riven by their own (often Eurocentric) politics, with no responsibility to be fair – and often aren’t.

  • Dehumanising language during an outbreak

    It appears the SARS-CoV-2 coronavirus has begun local transmission in India, i.e. infecting more people within the country instead of each new patient having recently travelled to an already affected country. The advent of local transmission is an important event in the lexicon of epidemics and pandemics because, at least until 2009, that’s how the WHO differentiated between the two.

    As of today, the virus has become locally transmissible in the world’s two most populous countries. At this juncture, pretty much everyone expects the number of cases within India to only increase, and as it does, the public healthcare system won’t be the only one under pressure. Reporters and editors will be too, and they’re likely to be more stressed on one front: their readers.

    For example, over the course of March 4, the following sentences appeared in various news reports of the coronavirus:

    The Italian man infected 16 Italians, his wife and an Indian driver.

    The infected techie boarded a bus to Hyderabad from Bengaluru and jeopardised the safety of his co-passengers.

    Two new suspected coronavirus cases have been reported in Hyderabad.

    All 28 cases of infection are being monitored, the health ministry has said.

    Quite a few people on Twitter, and likely in other fora, commented that these lines exemplify the sort of insensitivity towards patients that dehumanises them, elides their agency and casts them as perpetrators – of the transmission of a disease – and which, perhaps given enough time and reception, could engender apathy and even animosity towards the poorer sick.

    The problem words seem to include ‘cases’, ‘burden’ and ‘infected’. But are they a problem, really? I ask because though I understand the complaints, I think they’re missing an important detail.

    Referring to people as if they were objects only furthers their impotency in a medical care setup in which doctors can’t be questioned and the rationale for diagnoses is frequently secreted – both conditions ripe for exploitation. At the same time, the public part of this system has to deal with a case load it is barely equipped for and whose workers are underpaid relative to their counterparts in the private sector.

    As a result, a doctor seeing 10- or 20-times as many patients as they’ve been trained and supported to will inevitably precipitate some amount of dehumanisation, and it could in fact help medical workers cope with circumstances in which they’re doing all they can to help but the patient suffers anyway. So dehumanisation is not always bad.

    Second, and perhaps more importantly, the word ‘dehumanise’ and the attitude ‘dehumanise’ can and often do differ. For example, Union home minister Amit Shah calling Bangladeshi immigrants “termites” is not the same as a high-ranking doctor referring to his patient in terms of their disease, and this doctor is not the same as an overworked nurse referring to the people in her care as ‘cases’. The last two examples are progressively more forgivable because their use of the English language is more opportunistic, and the nurse in the last example may not be intentionally dehumanising her patients at all – and might choose other words if she knew what those words implied.

    (The doctor didn’t: his example is based on a true story.)

    Problematic attitudes often manifest most prominently as problematic words and labels but the use of a word alone wouldn’t imply a specific attitude in a country that has always had an uneasy relationship with the English language. Reporters and editors who carefully avoid potentially debilitating language as well as those who carefully use such language are both in the minority in India. Instead, my experiences as a journalist over eight years suggest the majority is composed of people who don’t know the language is a problem, who don’t have the time, energy and/or freedom to think about casual dehumanisation, and who don’t deserve to be blamed for something they don’t know they’re doing.

    But by fixating on just words, and not the world of problems that gives rise to them, we risk interrogating and blaming the wrong causes. It would be fairer to expect journalists of, say, The Guardian or the Washington Post to contemplate the relationship between language and thought, if only because Western society harbours a deeper understanding of the healthcare system it originated and exported, idiosyncrasies and all, to other parts of the world, and because native English speakers are likelier to properly understand the relationship between a word, its roots and its use in conversation.

    On the other hand, non-native users of English – particularly non-fluent users – have no option but to use the words ‘case’, ‘burden’ and ‘infected’. They might actually prefer other words if:

    • They knew that (and/or had to accommodate their readers’ pickiness for whether) the word they used meant more than what they thought it did, or
    • They knew alternative words existed and were equally valid, or
    • They could confidently differentiate between a technical term and its most historically, socially, culturally and/or technically appropriate synonym.

    But as it happens, these conditions are seldom met. In India, English is mostly reserved for communication; it’s not the language of thought for most people, especially most journalists, and certainly doesn’t hold anything more than a shard of mirror-glass to our societies and their social attitudes as they pertain to jargon. As such, pointing to a reporter and asking them to say ‘persons infected with coronavirus’ instead of ‘case’ will magically reveal neither the difference between ‘case’ or ‘infected’ the scientific terms and ‘case’ or ‘infected’ the pejoratives, nor the negotiated relationship between the use of ‘case’ and dehumanisation. And without elucidating the full breadth of these relationships, there is no way either doctors or reporters are going to modify their language simply because they were asked to – nor will their doing so, on the off chance, strike at the real threats.

    On the other hand, there is bound to be an equally valid problem in terms of those who know how ‘case’ and ‘infected’ can be misused and who regularly read news reports whose use of English may or may not intend to dehumanise. Considering the strong possibility that the author may not know they’re using dehumanising language and are unlikely to be persuaded to write differently, those in the know have a corresponding responsibility to accommodate what is typically a case of the unknown unknowns and not ignorance or incompetence, and almost surely not malice.

    This is also why I said reporters and editors might be stressed by their readers – rather, by their readers’ perspectives – and not on account of their language.


    A final point: Harsh Vardhan, the Union health minister and utterer of the words “The Italian man infected 16 Italians”, and Amit Shah belong to the same party – a party that has habitually dehumanised Muslims, Dalits and immigrants as part of its nationalistic, xenophobic and communal narratives. More recently, the same party from its place at the Centre suspected a prominent research lab of weaponising the Nipah virus with help from foreign funds, and used this far-fetched possibility as an excuse to terminate the lab’s FCRA license.

    So when Vardhan says ‘infected’, I reflexively, and nervously, double-check his statement for signs of ambiguity. I’m also anxious that if more Italian nationals touring India are infected by SARS-CoV-2 and the public healthcare system slips up on control measures, a wave of anti-Italian sentiment could follow.

  • A new beginning

    When The Wire was launched on May 11, 2015, we (the editors) decided to organise the site’s content within six principal categories: politics, political economy, foreign affairs, science, culture and law.

    In the five years since, the Big Three categories — politics, political economy and foreign affairs — have come to dominate The Wire‘s identity as a digital news site, even as our science category has acquired a voice of its own and performed much better than we expected. And yet, given the crush of ‘conventional’ news, science has not been able to find its fullest voice on the crowded pages of The Wire.

    To fix this issue, as well as to give our science stories the freedom — and responsibility — to constitute their own publication (of sorts), on February 28 we launched The Wire Science as its own beast: https://science.thewire.in.

    While we remain strapped for resources, we recognise that it’s a necessary step on the road to the top: an independent, fully reader-funded Indian website for science news, analysis and commentary. That said, we will begin populating the new site with shorter, longer and different types of stories that we can already afford and which now have the breathing room they need.

    As always, please engage with The Wire Science, share the stories you like, comment and discuss on Twitter and Facebook, send your bouquets and brickbats to science at thewire dot in, and please donate (especially if you can). This is all we need for the trek. 🙂

  • Freeman Dyson’s PhD

    The physicist, thinker and writer Freeman Dyson passed away on February 28, 2020, at the age of 96. I wrote his obituary for The Wire Science; excerpt:

    The 1965 Nobel Prize for the development of [quantum electrodynamics] excluded Dyson. … If this troubled Dyson, it didn’t show; indeed, anyone who knew him wouldn’t have expected differently. Dyson’s life, work, thought and writing is a testament to a philosophy of doing science that has rapidly faded through the 20th century, although this was due to an unlikely combination of privileges. For one, in 1986, he said of PhDs, “I think it’s a thoroughly bad system, so it’s not quite accidental that I didn’t get one, but it was convenient.” But he also admitted it was easier for him to get by without a PhD.

    His QED paper, together with a clutch of others in mathematical physics, gave him a free-pass to more than just dabble in a variety of other interests, not all of them related to theoretical physics and quite a few wandering into science fiction. … In 1951, he was offered a position to teach at Cornell even though he didn’t have a doctorate.

    Since his passing, many people have latched on to the idea that Dyson didn’t care for awards and that “he didn’t even bother getting a PhD” as if it were a difficult but inspiring personal choice, and celebrate it. It’s certainly an unlikely position to assume and makes for the sort of historical moment that those displeased with the status quo can anchor themselves to and swing from for reform, considering the greater centrality of PhDs to the research ecosystem together with the declining quality of PhD theses produced at ‘less elite’ institutions.

    This said, I’m uncomfortable with such utterances when they don’t simultaneously acknowledge the privileges that secured for Dyson his undoubtedly deserved place in history. Even a casual reading of Dyson’s circumstances suggests he didn’t have to complete his doctoral thesis (under Hans Bethe at Cornell University) because he’d been offered a teaching position on the back of his contributions to the theory of quantum electrodynamics, and was hired by the Institute for Advanced Study in Princeton a year later.

    It’s important to mention – and thus remember – which privileges were at play so that a) we don’t end up unduly eulogising Dyson, or anyone else, and b) we don’t attribute Dyson’s choice to his individual personality alone instead of also admitting the circumstances Dyson was able to take for granted and which shielded him from adverse consequences. He “didn’t bother getting a PhD” because he wasn’t the worse for it; in one interview, he says he feels himself “very lucky” he “didn’t have to go through it”. On the other hand, even those who don’t care for awards today are better off with one or two because:

    • The nature of research has changed
    • Physics has become much more specialised than it was in 1948-1952
    • Degrees, grants, publications and awards have become proxies for excellence when sifting through increasingly overcrowded applicants’ pools
    • Guided by business decisions, journals’ definition of ‘good science’ has changed
    • Vannevar Bush’s “free play of free intellects” paradigm of administering research is much less in currency
    • Funding for science has dropped, partly because The War ended, and took a chunk of administrative freedom with it

    The expectations of scientists have also changed. IIRC Dyson didn’t take on any PhD students, perhaps as a result of his dislike for the system (among other reasons because he believed it penalises students not interested in working on a single problem for many years at a time). But considering how the burdens on national education systems have shifted, his decision would be much harder to sustain today even if all of the other problems didn’t exist. Moreover, he has referred to his decision as a personal choice – that it wasn’t his “style” – so treating it as a prescription for others may mischaracterise the scope and nature of his disagreement.

    However, questions about whether Dyson might have acted differently if he’d had to really fight the PhD system, which he certainly had problems with, are moot. I’m not discussing his stomach for a struggle nor am I trying to find fault with Dyson’s stance; the former is a pointless consideration and the latter would be misguided.

    Instead, it seems to me to be a question of what we do know: Dyson didn’t get a PhD because he didn’t have to. His privileges were a part of his decision and cemented its consequences, and a proper telling of the account should accommodate them even if only to suggest a “Dysonian pride” in doing science requires a strong personality as well as a conspiracy of conditions lying beyond the individual’s control, and to ensure reform is directed against the right challenges.

    Featured image: Freeman Dyson, October 2005. Credit: ioerror/Wikimedia Commons, CC BY-SA 2.0.

  • Mad Mike: Foolish Road

    On Sunday, an American thrill-seeker named Mike Hughes died after attempting to launch himself to an altitude of 5,000 feet on a homemade steam-powered rocket. A video of the accident is available because a crew of the Science Channel filmed the incident as part of a programme called ‘Homemade Astronauts’. On February 23, Science Channel tweeted condolences to his loved ones, and said Hughes had died trying to fulfil his dream. But in fact he had died for no reason at all.

    Hughes believed Earth was flat and had hoped to ‘prove’ it by flying himself to space, which makes Science Channel’s conduct irresponsible if not entirely reckless. I assume here that the Science Channel knows Earth is an oblate spheroid, and knows how that knowledge was obtained. But it still decided to capitalise on another person’s ignorance, presumably in the names of objectivity and balance, and let him put himself in danger (with airtime on the Science Channel as an incentive).

    For his part, Hughes wasn’t very smart either: aside from thinking Earth is flat, he could never have proven, or disproven, his claim by flying to 5,000 feet. Millions of people routinely fly on airplanes that cruise at 35,000 feet and have access to windows. Even at this altitude, Earth’s curvature is not apparent because the field of view is not wide enough. Hughes likely would have had some success (or failure, depending on your PoV) if he had been able to reach, say, 40,000 feet on a cloud-free day.

    But even then, the Kármán line – the region beyond which is denoted space – lies 328,000 feet up. So by flying to a height of 5,000 feet, Hughes was never going to be an astronaut in any sense of the term nor was he going to learn anything new, except of course finding new reasons to persist with his ignorance. On the other hand, a TV channel called ‘Science’ quite likely knew all this and let Hughes carry on anyway – instead of, say, taking him to a beach and asking him to watch ships rise as if from under the horizon.
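
    To put rough numbers on this – a minimal sketch, assuming a perfectly spherical Earth of radius 6,371 km and ignoring atmospheric refraction – here is how far away the horizon sits, and how little it dips below eye level, at the altitudes mentioned above:

    ```python
    import math

    R = 6371.0  # mean Earth radius in km (spherical approximation; refraction ignored)

    def horizon(altitude_km):
        """Return the distance to the horizon (km) and its dip below eye level (degrees)."""
        distance = math.sqrt(altitude_km * (2 * R + altitude_km))  # line-of-sight distance to the horizon
        dip = math.degrees(math.acos(R / (R + altitude_km)))       # angle the horizon sits below the horizontal
        return distance, dip

    for label, feet in [("Hughes's target", 5_000), ("airliner cruise", 35_000), ("Karman line", 328_000)]:
        altitude_km = feet * 0.3048 / 1000  # feet to kilometres
        d, dip = horizon(altitude_km)
        print(f"{label} ({feet:,} ft): horizon ~{d:.0f} km away, ~{dip:.1f} degrees below eye level")
    ```

    At 5,000 feet the horizon dips barely a degree below eye level; even at the Kármán line it dips only about 10 degrees. Whatever Hughes saw from his rocket would have looked much like the view from a tall hill.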

  • Distracting from the peer-review problem

    From an article entitled ‘The risks of swiftly spreading coronavirus research‘ published by Reuters:

    A Reuters analysis found that at least 153 studies – including epidemiological papers, genetic analyses and clinical reports – examining every aspect of the disease, now called COVID-19 – have been posted or published since the start of the outbreak. These involved 675 researchers from around the globe. …

    Richard Horton, editor-in-chief of The Lancet group of science and medical journals, says he’s instituted “surge capacity” staffing to sift through a flood of 30 to 40 submissions of scientific research a day to his group alone.

    … much of [this work] is raw. With most fresh science being posted online without being peer-reviewed, some of the material lacks scientific rigour, experts say, and some has already been exposed as flawed, or plain wrong, and has been withdrawn.

    “The public will not benefit from early findings if they are flawed or hyped,” said Tom Sheldon, a science communications specialist at Britain’s non-profit Science Media Centre. …

    Preprints allow their authors to contribute to the scientific debate and can foster collaboration, but they can also bring researchers almost instant, international media and public attention.

    “Some of the material that’s been put out – on pre-print servers for example – clearly has been… unhelpful,” said The Lancet’s Horton.

    “Whether it’s fake news or misinformation or rumour-mongering, it’s certainly contributed to fear and panic.” …

    Magdalena Skipper, editor-in-chief of Nature, said her group of journals, like The Lancet’s, was working hard to “select and filter” submitted manuscripts. “We will never compromise the rigour of our peer review, and papers will only be accepted once … they have been thoroughly assessed,” she said.

    When Horton or Sheldon say some of the preprints have been “unhelpful” and that they cause panic among the people – which people do they mean? No non-expert is hitting up bioRxiv looking for COVID-19 papers. They mean some lazy journalists and some irresponsible scientists are spreading misinformation – and frankly, addressing their habits would be a more responsible fix than pointing fingers at preprints.

    The Reuters analysis also says nothing about how well preprint repositories as well as scientists on social media platforms are conducting open peer-review, instead cherry-picking reasons to compose a lopsided argument against greater transparency in the knowledge economy. Indeed, crisis situations like the COVID-19 outbreak often seem to become ground zero for contemplating the need for preprints but really, no one seems to want to discuss “peer-reviewed” disasters like the one recently publicised by Elisabeth Bik. To quote from The Wire (emphasis added),

    [Elisabeth] Bik, @SmutClyde, @mortenoxe and @TigerBB8 (all Twitter handles of unidentified persons), report – as written by Bik in a blog post – that “the Western blot bands in all 400+ papers are all very regularly spaced and have a smooth appearance in the shape of a dumbbell or tadpole, without any of the usual smudges or stains. All bands are placed on similar looking backgrounds, suggesting they were copy-pasted from other sources or computer generated.”

    Bik also notes that most of the papers, though not all, were published in only six journals: Artificial Cells, Nanomedicine and Biotechnology; Journal of Cellular Biochemistry; Biomedicine & Pharmacotherapy; Experimental and Molecular Pathology; Journal of Cellular Physiology; and Cellular Physiology and Biochemistry – all maintained by reputed publishers and, importantly, all of them peer-reviewed.

  • Peter Higgs, self-promoter

    I was randomly rewatching The Big Bang Theory on Netflix today when I spotted this gem:

    Okay, maybe less a gem and more a shiny stone, but still. The screenshot, taken from the third episode of the sixth season, shows Sheldon Cooper mansplaining to Penny the work of Peter Higgs, whose name is most famously associated with the scalar boson the Large Hadron Collider collaboration announced the discovery of to great fanfare in 2012.

    My fascination pertains to Sheldon’s description of Higgs as an “accomplished self-promoter”. Higgs, in real life, is extremely reclusive and self-effacing, and journalists have found him notoriously hard to catch for an interview, or even a quote. His fellow discoverers of the Higgs boson, including François Englert, the Belgian physicist with whom Higgs won the Nobel Prize for physics in 2013, have been much less media-shy. Higgs has even been known to suggest that a mechanism in particle physics involving the Higgs boson should really be called the ABEGHHK’tH mechanism, to include the names of everyone who hit upon its theoretical idea in the 1960s (Philip Warren Anderson, Robert Brout, Englert, Gerald Guralnik, C.R. Hagen, Higgs, Tom Kibble and Gerardus ‘t Hooft), instead of just the Higgs mechanism.

    No doubt Sheldon thinks Higgs did right by choosing not to appear in interviews for the public or write articles in the press himself, considering such extreme self-effacement is also Sheldon’s modus of choice. At the same time, Higgs might have lucked out, being recognised for work he conducted 50 years prior probably because he’s white and from an affluent country, attributes that nearly guarantee fewer – if any – systemic barriers to international success. Self-promotion is an important part of the modern scientific endeavour, as it is with most modern endeavours, even if one is an accomplished scientist.

    All this said, it is notable that Higgs was also a conscientious person. When he was awarded the Wolf Prize in 2004 – a prestigious award in the field of physics – he refused to receive it in person in Jerusalem because it was a state function and he had protested Israel’s war against Palestine. He was a member of the Campaign for Nuclear Disarmament until the group extended its opposition to nuclear power as well; then he resigned. He also stopped supporting Greenpeace after they became opposed to genetic modification. If it is for these actions that Sheldon deemed Higgs an “accomplished self-promoter”, then I stand corrected.

    Featured image: A portrait of Peter Higgs by Lucinda Mackay hanging at the James Clerk Maxwell Foundation, Edinburgh. Caption and credit: FF-UK/Wikimedia Commons, CC BY-SA 4.0.

  • The potential energy of being entertained

    Netflix just published a report prepared in line with the Sustainability Accounting Standards Board framework, estimating – among other things – its environmental footprint for operations during the year 2019. According to the report, as The Guardian columnist Arwa Mahdawi writes:

    Binge-watching Netflix doesn’t just fry your brain; it may also be frying the planet. The streaming service’s global energy consumption increased by 84% in 2019 to a total of 451,000 megawatt hours – enough to power 40,000 average US homes for a year.
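
    The arithmetic behind that comparison checks out, assuming – a ballpark of my own, not a figure from the report – that an average US household uses roughly 11,000 kWh of electricity a year:

    ```python
    # Quick sanity check of the "40,000 homes" comparison (all figures approximate)
    netflix_2019_mwh = 451_000       # Netflix's reported 2019 energy consumption, in MWh
    us_home_kwh_per_year = 11_000    # assumed ballpark annual electricity use of an average US household

    homes_for_a_year = netflix_2019_mwh * 1_000 / us_home_kwh_per_year
    print(f"enough to power ~{homes_for_a_year:,.0f} average US homes for a year")  # roughly 41,000
    ```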

    This is staggering but not surprising. Through history, the place at which energy is consumed to produce a product has been becoming less and less strongly associated with where the product is likely to be purchased. The invention of sails, the steam engine, the internal combustion engine and then satellites each rapidly transformed the speed at which goods could traverse Earth’s surface as well as the speed at which consumers could make more and more informed – therefore more and more rational – choices, assisted by economic reforms like globalisation and foreign direct investment.

    The most recent disruption on this front was wrought, of course, by the internet and a little later the cloud. Now, with industries like movie-making, gaming, digital publishing and even large-scale computing, nothing short of a full-planet energy-accounting exercise makes sense. At the same time economic power, inequality and effective governance remain unevenly distributed, leading to knotty problems about determining how much each consumer of a company’s product is effectively responsible for the total energy required to make all products in that batch (since scale also matters).

    Such accounting exercises have become increasingly popular, as they should be; private enterprises like Netflix as well as government organisations have started counting their calories – their carbon intake, output, emissions, trade, export, etc. – as a presumable first step towards limiting greenhouse gas emissions and helping keep Earth’s surface from warming any more than is already likely (2° C by 2100).

    There is a catch, of course: it’s difficult to affect or even estimate the relative contribution of one’s operations to the effort to restrict global warming without also accounting for one’s wider economic setting. For example, Netflix likely displaced the DVD rental industry as well as stole users, and their respective carbon ‘demand’, from cable. So Mahdawi’s ringing the alarm bells based on Netflix’s report alone is only meaningful in a stand-alone scenario in which the status quo is ‘memoryless’.

    However, even in this contextually limited aspiration to lower emissions and its attendant benefits for human wellbeing, joy, hope and optimism don’t seem to feature as much or, in many cases, at all.

    Knowing Earth is already headed for widespread devastation can certainly smother action and deflate resolve. But while journalists and researchers alike have been debating the pros and cons of using positive or negative messaging as the better way to spur climate action, their most popular examples are rooted in quantifiable tasks or objects: either “Earth is getting more screwed by the hour but you can help by segregating your trash, using public transport and quitting meat” or “Sea-levels are rising, the Arctic is melting and heat-waves are becoming more frequent and more intense”.

    It seems as if happiness cannot fit into either paradigm without specifying the number of degrees by which it will move the climate action needle. So it also becomes easily excluded from conversations about climate-change adaptation and mitigation. As Mahdawi writes in her column,

    Being a conscientious consumer does not mean you have to turn off your wifi or chill with the Netflix. But we should think more critically about our data consumption. Apple already delivers screen-time reports; perhaps tech services should start providing us with carbon counts. Or maybe Netflix should implement carbon warnings. Caution: this program contains nudity, graphic language and a hell of a lot of energy.

    If Netflix did issue such a warning, it would no longer be a popular pastime.

    One of the purposes of popular culture, beyond its ability to channel creative expression and empower artists, is entertainment. We consume the products of popular culture, nucleated as music, dance, theatre, films, TV shows, books, paintings, sculptures and other forms, among other reasons to understand and locate ourselves outside the systems of capitalist production, to identify ourselves as members of communities, groups, cities or whatever by engaging with knowledge, objects – whether a book or the commons – or experiences that we have created, to assert that we are much more than where we work or what we earn.

    Without these spaces and unfettered access to them, we become less able to escape the side-effects of neoliberalism: consumerism, hyper-competitiveness, social isolation and depression. I’m not saying you are likelier to feel depressed without Netflix but that Netflix is one of many sources of cultural information, and is therefore an instrument with which people around the world gather in groups based on cultural preferences – forming, in turn, a part of the foundation on which people are inspired to have new ideas, are motivated to act, and upon which they even expand their hopes and ambitions.

    Of course, Netflix is itself a product of 21st century capitalism plus the internet. Like iTunes, YouTube, Prime, Disney, etc. Netflix is a corporation that has eased access to many petabytes of entertainment data across the globe but by rendering artists and entertainers even less powerful than they were and reducing their profits (rather, limiting their profits’ growth). The oft-advanced excuse, that the company simply levies a fee in return for easing barriers to discover new audiences, doesn’t always square off properly with the always-increasing labour required to create something new. So simply asking Netflix to not display a warning about the amount of energy required to produce a show may seem like a half-measure designed to fight off all of capitalism’s monsters except one.

    We have a responsibility to iteratively replace the most problematic ways in which we profit from labour and generate wealth with practices that improve economic equality, social dignity, and access to education, healthcare and good living conditions. However, how do we balance this responsibility with a million people being able to watch a cautionary documentary about the rise of fascism in 1930s’ Germany, a film about the ills of plastic use or an explainer about the ways in which trees do and don’t fight global warming?

    Binge-watching is bad – in terms of consuming enough energy to “power 40,000 average US homes for a year” as well as in other ways – but book-keepers seem content to insulate the act of watching itself from what is being watched, perhaps in an effort to determine the absolute worst case scenario or because it is very hard to understand, leave alone discern or even predict, the causal relationships between how we feel, how we think and how we act. However, this is also what we need: to accommodate, but at the same time without being compelled to quantify, the potential energy that arises from being entertained.