Uncategorized

  • Billow clouds, shocked streams & shedding eddies

    I flew from Bangalore to Delhi on Tuesday. The flight was early in the day, at 6, and so I had the wonderful opportunity to watch a sunrise from above a sea of clouds. One very beautiful sight was the presence of uniquely shaped ones, styled like the waves in Hokusai’s The Great Wave off Kanagawa.

    'The Great Wave off Kanagawa'
    Photo: Wikimedia Commons

    I recalled having seen them in Tuticorin sometime in late 2011, but I couldn’t remember what they were called. Their vortex-like upper tips briefly had me confusing them with a Karman vortex street. Thankfully, one Google search led to another and I came upon the answer: billow clouds.

    A photograph of billow clouds. Photo: wunderground.com

    Billow clouds, I re-learnt, are the result of what’s called a Kelvin-Helmholtz instability: When two fluids of different densities and sharing a surface are moving parallel to each other, the surface becomes unstable if their relative velocity reaches a certain threshold.

    When there’s talk of fluids, surface tension is likely to be involved, too: it resists the instability, and the relative velocity has to be large enough to overcome it. However, “surface tension is not relevant on atmospheric scales,” said Dr. Rajaram Nityananda, of IISER, Pune.
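
    For reference, the textbook criterion for the instability – a standard result for two deep fluids of densities ρ1 > ρ2 sliding past each other, with surface tension σ and gravitational acceleration g, and not something specific to these clouds – is

    \[ (U_1 - U_2)^2 \;>\; \frac{2(\rho_1 + \rho_2)}{\rho_1 \rho_2} \sqrt{g \, \sigma \, (\rho_1 - \rho_2)} \]

    For air blowing over water, this works out to a relative speed of roughly 6.5 m/s. Drop the surface-tension term, as in the atmosphere, and what matters instead is whether the velocity shear is strong enough to overcome the stabilizing density stratification.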

    More interestingly, subtle variations on the Kelvin-Helmholtz instability give rise to more complex shapes, and even more complex titles. For example, if the lighter fluid is pushing against the heavier fluid, a Rayleigh-Taylor instability results. A memorable manifestation of this is the mushroom cloud that forms after a powerful nuclear explosion, where the hot, less dense debris rises and pushes into the cooler, denser air above it.

    A mushroom cloud rising from the Castle Romeo nuclear test, 1954.
    Image: Wikimedia Commons

    If you sent a shockwave through the interface between two fluids of different densities flowing parallel to each other, you’d get the Richtmyer-Meshkov instability. The shockwave causes both fluids to accelerate and waver, and the disturbance builds up over time. If the heavier fluid accelerates into the lighter one, it pushes through as spikes; if the lighter accelerates into the heavier one, it produces bubbles. Eventually, the instability grows until the two fluids are mixed.

    Simulation of a shockwave-induced Richtmyer-Meshkov instability. Image: Wikimedia Commons

    This could be leveraged in jet engines: a parallel flow of fuel and oxygen could be destabilized using a shockwave so that the fuel is broken up into finer droplets that are easier to combust.

    At last, we come to my “phenomenological” favorite (not that there’s a list): the Karman vortex street. Instead of there being two fluids, imagine just the one, in whose path a blunt obstacle is placed. When it meets the obstacle, the fluid is split into two swirling streams. If the fluid is flowing fast enough, given the shape of the obstacle, the streams reconcile their paths after crossing it by forming vortices – sometimes a whole street of them.

    Notice the gradual onset of instability until the 49th second. Karman vortices are evidently not hard to find as many satellite images of winds blowing past small islands have shown.
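
    As a back-of-the-envelope illustration – the island size and wind speed below are assumed for illustration, not taken from any particular image – the shedding frequency behind a bluff obstacle follows the textbook Strouhal relation, with a Strouhal number of about 0.2:

    [code language="r"]
    # Rough estimate of how often vortices are shed behind an island,
    # using f = St * U / D with St ~ 0.2 for a bluff body.
    St <- 0.2      # Strouhal number, roughly constant over a wide range of speeds
    U  <- 10       # wind speed in m/s (assumed)
    D  <- 10e3     # island diameter in m (assumed)

    f <- St * U / D   # shedding frequency, in Hz
    1 / f / 3600      # time between successive vortices: about 1.4 hours
    [/code]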

    Image: http://disc.sci.gsfc.nasa.gov/

    These effects are as astounding as the foundational principles are elegant. If simple disturbances on one or two streams are responsible for such a variety of shapes, imagine what the bottomless repertoire of fluid dynamics has to offer.

  • Motion charts in R

    Formatting a motion chart by hand with the Google Visualization API is hellish; it’s surprising how simple R makes it. One more reason to learn this underrated language. To create this motion chart, I downloaded the data from here (as a CSV) and keyed in the following lines in RStudio:

    [code language="r"]
    # Install and load the googleVis package (installing is a one-time step)
    install.packages("googleVis")
    library("googleVis")

    # Read the downloaded CSV and build the motion chart, identifying each
    # series by country and the time axis by year
    emissions <- read.csv("–path to file–/emissions.csv", header=TRUE)
    emissions_Motion <- gvisMotionChart(emissions, idvar="Country", timevar="Year")

    # View the chart in the browser and save its HTML for embedding
    plot(emissions_Motion)
    cat(emissions_Motion$html$chart, file="emissions.html")
    [/code]

    That’s it.

  • Science Quiz – July 7, 2014

    Every week, I create a science quiz for The Hindu newspaper’s In School product. It consists of 10 questions and covers only developments from the week preceding its day of publication (Monday). The answers are at the end.

    1. The _______ region of southwest China is some 4.5 km above sea-level. At this altitude, the air is rarefied and makes breathing difficult for humans. However, the _______ people are an exception, according to American and European scientists. On July 2, they said they had found a gene these people had inherited from an extinct species of humans called the Denisovans that enabled them to breathe and live normally in areas where the air was thin. Fill in the blanks with the name of the region or the people.
    2. On July 2, NASA launched a satellite named OCO-2 that will monitor Earth’s carbon dioxide levels 24 times every second. Specifically, it will record where on Earth carbon dioxide is being produced and where it is being removed from the air, revealing a very detailed picture of this greenhouse gas. What does OCO stand for?
    3. Name the first storm of the 2014 Atlantic hurricane season which is also one of the earliest hurricanes to have occurred in a calendar year.
    4. The __ ____ is a system of warm ocean temperatures that occurs over the Pacific Ocean and influences how strong or weak the Indian monsoons can be. Usually, the part of the Pacific close to the coast of South America becomes warmer than usual, and the part close to Indonesia becomes cooler. However, in 2009, the entire ocean showed signs of warming, which, according to many climate models, weakened the 2009 monsoon season in India, ending in a drought. A week ago, the World Meteorological Organization issued an assessment that the same kind of warming was happening in 2014 as well, which is why this year’s monsoons could be weak. Fill in the blank with the name of the warming phenomenon, which in Spanish means “The Boy” – a reference to a young Jesus, because this phenomenon’s effect is noticed around Christmas.
    5. More than 30 hair samples purportedly from the creature called _______ are actually from cows, bears, raccoons and some other animals, a DNA analysis by scientists from Oxford University showed on July 2. Fill in the blank with the name of a long-sought creature that has also been known as a Yeti in the Himalayan region.
    6. On June 30, ecologists from Spain said they had made a strange observation: according to them, there were only some 7,000 to 35,000 tons of plastic in the world’s oceans where there should have been millions of tons. They were able to arrive at this number by travelling around the world on a ship called the _________ in 2010, studying plastic concentrations. They have two explanations: either the plastics are being disintegrated into smaller and smaller bits, or they are being carried deeper into the ocean. Name the ship.
    7. On July 5, 1687 – 327 years ago – the great British physicist and mathematician Isaac Newton published the book that first described his laws of motion and law of universal gravitation. The book has a long name, and is colloquially known just by the third word of its name, which means “Principles” in Latin. What is it?
    8. What is the Cassini Grand Finale?
    9. If sea ice continues to melt at the rate at which it is melting now, the world’s population of _______ ________ will be cut by 50%, according to a new study published on June 30. Fill in the blanks with the name of a bird that has been made famous by movies like ‘Happy Feet’.
    10. July 1 was the 368th birth anniversary of a famous German philosopher and mathematician. He is acknowledged as one of the inventors of the mathematical tool called calculus, and for his extensive work on mechanical calculators, refining the binary system used in modern computers, and for his optimistic philosophy. Name him.

    Answers

    1. Tibetan
    2. Orbiting Carbon Observatory
    3. Hurricane Arthur
    4. El Niño
    5. Bigfoot
    6. Malaspina
    7. Principia (the full name is ‘Philosophiae Naturalis Principia Mathematica’)
    8. NASA has planned that, starting in late 2016, the Cassini spacecraft currently orbiting Saturn will start orbiting between the planet and its innermost ring before plunging into the gas giant to kill itself by September 2017.
    9. Emperor penguins
    10. Gottfried Wilhelm von Leibniz

  • Draft policy on increasing access to DBT/DST research

    An Open Access Policy Committee has drafted a policy to enhance access to publicly funded research by setting up a national open access (OA) repository under the oversight of the Department of Biotechnology (DBT) and the Department of Science and Technology (DST). Reproduced in full here:

    [scribd id=232706860 key=key-F0DyDEF5RFt1wUqxpkDO mode=scroll]

    This is a very good move that will highlight what OA can do to spur scientific research and science communication in the country. It will also

    • foster a “richer research culture” as the draft says,
    • increase the accountability and traceability of public funds and the research they sponsor, and
    • make the process of resource selection/allocation more transparent.

    Some quick points:

    1. Accountability of DBT/DST-controlled research is enforced by mandating that uploaded papers mention the grant ID.
    2. Papers should be deposited in OA repositories once accepted by a journal, but OA will be enabled only when the embargo lifts. So maybe the DBT/DST OA repositories will be like a national pre-print server – but that depends on the nature of the embargo.
    3. The paper (pre-print?) will be OA whether or not the journal is OA. Moreover, “Publisher agrees to provide to Author within 14 days of first publication and at no charge an electronic copy of the published Article in a format … that preserves final page layout, formatting, and content. No technical restriction, such as security settings, will be imposed to prevent copying or printing of the document.” What if highly profitable non-OA journals based outside the country (which researchers aspire to publish in to secure advantages in non-DBT/DST settings) disagree?
    4. An author who cannot furnish his/her publication ID will not be considered for promotions, fellowships, research grants, etc., if his/her institution is under the administrative control of DBT/DST. On the other hand, how will conflicts of interest/nepotism be prevented in this regard?
    5. The DBT/DST will bear the cost of maintaining the central repository, which should eliminate conflicts of interest arising from payment-for-publication. Will the DBT/DST help set up institutional repositories? Since these IRs have to be “interoperable”, what are the standards the administration has in mind?
    6. What about research that is funded by private parties? What fraction of research funding should the DBT/DST bear for the paper to be mandatorily deposited in an OA repository?

  • New Higgs results show signs of SUSY

    Two years ago, physicists working on the Large Hadron Collider first announced the discovery of a Higgs boson-like particle, setting the high-energy physics community atwitter. And it was only a couple of weeks ago that physicists also announced that the particle was definitely the one predicted by the sturdy Standard Model of particle physics, the theory that governs the Higgs boson’s properties and behavior.

    But new results from the ongoing International Conference on High Energy Physics in Valencia, Spain, could add a twist to this plot. Physicists announced that they had evidence – albeit not strong enough to be conclusive – that the Higgs boson was showing signs of disobeying the model.

    Members of the ATLAS and CMS collaborations, who work with the detectors of that name, said they had results showing the Higgs boson was decaying into a pair of particles called W bosons at a rate some 20% higher than predicted by the Standard Model. This non-compliance will be a breath of fresh air for physicists who have been faithful to a potent but as yet unobserved theory of new physics called supersymmetry, in short and fondly SUSY.

    The W boson mediates the decay of radioactive substances in nature. At sufficiently high energies, such as those produced inside the Large Hadron Collider (LHC), these bosons emerge from a multitude of particle interactions, and they have been widely studied since their discovery in 1983. Under these circumstances, a claim of signs of SUSY in Higgs decays to W-boson pairs leaves little room for uncertainty.

    SUSY predicts that for every fermion, or matter particle, of the Standard Model there is a partner particle that is a boson, called a sfermion. Conversely, for every boson, or force particle, of the Standard Model there is a partner particle that is a fermion, called a bosino. Physicists who believe SUSY is a plausible theory use these extra particles to solve problems that the Standard Model can’t. One of them is the problem of dark matter; another is to explain why the Higgs boson is much lighter than it should be.

    Jong Soo Kim et al have described how the anomalous decay rates could be explained using a simple version of SUSY in a pre-print paper uploaded to arXiv on June 27. The paper is playfully titled ‘Stop that ambulance! New physics at the LHC?’. The ‘stop’ is a reference to the name of the supersymmetric partner of the top quark. The authors describe how a combination of supersymmetric particles including the stop squark could explain the new results with only a 1-in-370 chance of being a fluke. Even though this means physicists have about 99.7% confidence in the results, that’s still not high enough to claim a discovery, which by convention requires five standard deviations. When the LHC comes back online in 2015, physicists will be eager to put these results to the test.
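
    As an aside, the two numbers in that last paragraph – a 1-in-370 chance of a fluke and roughly 99.7% confidence – are just two ways of quoting what particle physicists call a three-standard-deviation (3σ) result:

    [code language="r"]
    # Two-sided tail probability of a normal distribution at 3 standard deviations
    p <- 2 * pnorm(-3)
    p          # ~0.0027, i.e. roughly a 1-in-370 chance of a statistical fluke
    1 / p      # ~370
    1 - p      # ~0.9973, the quoted ~99.7% confidence
    [/code]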

    The paper’s title might also refer to a comment that physicist Chris Parkes, spokesperson for the UK participation in the LHCb experiment at the LHC, made to the BBC during the Hadron Collider Physics Symposium in Kyoto, Japan, in November 2012. Results had been announced there of the B_s meson decaying into lighter particles at a rate predicted exactly by the Standard Model, nudging SUSY further toward impossibility. Parkes had said, “Supersymmetry may not be dead but these latest results have certainly put it into hospital.”

  • The federation of our digital identities

    Facebook, Twitter, email, WordPress, Instagram, online banking, the list goes on… Offline, you’re one person maintaining (presumably) one identity. On the web, you have many of them. All of them might point at you, but they’re still distinct packets of data floating through different websites. Within each site, your identity is unified, but between them, you’re different people. For example, I can’t log into Twitter with my Facebook username/password because Facebook owns them. When digital information becomes federated like this, it drives down cross-network accountability because my identity doesn’t move around.

    However, there are some popular exceptions to this. Facebook and Twitter don’t exchange my log-in credentials – the keys with which I unlock my identity – because they’re rivals, but many other pairs of services are not. For example, I can log into my YouTube account using my GMail credentials. When I hit ‘Submit’, YouTube banks on the validity of my identity on GMail to log me in. Suddenly, GMail and YouTube both have access to my behavioral information through the same username. In the name of convenience, my online visibility has increased and I’ve become exposed to targeted advertising, likely the least of the possible ills.

    The Crypto-Book

    John Maheswaran, a doctoral student at Yale University, has a solution. He’s called it ‘Crypto-Book’, describing its application and uses in a pre-print paper he and his colleagues uploaded to arXiv on June 16.

    1. The user clicks ‘Sign up using Facebook’ on StackOverflow.

    2. StackOverflow redirects the user to Facebook to log in using Facebook credentials.

    3. The user grants StackOverflow some permissions.

    4. Facebook generates a temporary OAuth access token corresponding to the permissions.

    5. Facebook redirects the user back to StackOverflow along with the access token.

    6. StackOverflow can now access the user’s Facebook resources in line with the granted permissions.
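
    For the curious, here is a rough sketch of what steps 2-6 look like in code, written in R with the httr package; the endpoint URLs, scopes and client IDs below are purely illustrative stand-ins, not the real Facebook or StackOverflow values.

    [code language="r"]
    library(httr)

    # Steps 2-3: the third-party site sends the user's browser to the provider's
    # authorization page, asking for certain permissions (scopes)
    auth_url <- modify_url("https://provider.example/dialog/oauth", query = list(
      client_id     = "THIRD_PARTY_APP_ID",
      redirect_uri  = "https://thirdparty.example/auth/callback",
      scope         = "public_profile",
      response_type = "code"
    ))
    browseURL(auth_url)  # the user logs in and approves; the provider redirects the
                         # browser back to redirect_uri with a short-lived ?code=...

    # Steps 4-5: the third-party site exchanges that code for an access token
    resp <- POST("https://provider.example/oauth/access_token", encode = "form",
                 body = list(
                   client_id     = "THIRD_PARTY_APP_ID",
                   client_secret = "THIRD_PARTY_APP_SECRET",
                   redirect_uri  = "https://thirdparty.example/auth/callback",
                   code          = "CODE_FROM_CALLBACK"
                 ))
    token <- content(resp)$access_token

    # Step 6: the token is presented with API requests, limited to the granted permissions
    GET("https://provider.example/api/me",
        add_headers(Authorization = paste("Bearer", token)))
    [/code]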

    Crypto-Book sits between steps 1 and 6. Instead of letting Facebook and StackOverflow talk to each other, it steps in to take your social network ID from Facebook, uses that to generate a username and password (in this context called a public and private key, respectively), and passes them on to StackOverflow for authentication.

    OpenID and OAuth

    It communicates with both sites using the OAuth protocol, which came into use in 2010. Five years before that, the OpenID protocol had launched to some success. In either case, the idea was to reduce the multiplicity of digital identities; in the context of sites like Facebook and Twitter, which could own your identities themselves, the protocols also enabled users to wield more control over what information they shared, or at least to keep track of it.

    OpenID let users register with it, and then functioned as a decentralized hub. If you wanted to log into WordPress next, you could do so with your OpenID credentials; WordPress only had to recognize the protocol. In that sense, it was like, say, Twitter, but with the sole function of maintaining a registry of identities. Its use has since declined because of a combination of its security shortcomings and other sites’ better authentication schemes. OAuth, on the other hand, has grown more popular. Unlike OpenID, OAuth is an identity-access protocol: it gives users a way to grant limited-access permissions to third-party sites without handing over any credentials (a feature called pseudo-authentication).

    So Crypto-Book inserts itself as an anonymizing layer to prevent Facebook and StackOverflow from exchanging tokens with each other. Maheswaran also describes additional techniques to bolster Crypto-Book’s security. For one, a user doesn’t receive his/her key pair from one server but many, and has to combine the different parts to make the whole. For another, the user can use the key-pair to log in to a site using a technique called linkable ring signatures, “which prove that the signer owns one of a list of public keys, without revealing which key,” the paper says. “This property is particularly useful in scenarios where trust is associated with a group rather than an individual.”
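
    To get a feel for the first of those ideas, here is a toy sketch – not the paper’s actual scheme – of combining secret shares from several key servers so that no single server ever sees the whole secret; the XOR-based combination and the three-server setup are illustrative assumptions.

    [code language="r"]
    # Toy illustration: three key servers each hold one random 32-byte share.
    # Only the user, who collects all of them, can reconstruct the full seed.
    xor_raw <- function(a, b) as.raw(bitwXor(as.integer(a), as.integer(b)))

    share1 <- as.raw(sample(0:255, 32, replace = TRUE))  # from server 1
    share2 <- as.raw(sample(0:255, 32, replace = TRUE))  # from server 2
    share3 <- as.raw(sample(0:255, 32, replace = TRUE))  # from server 3

    # Any single share, or even two of the three, looks like random noise
    # and reveals nothing about the combined seed.
    seed <- Reduce(xor_raw, list(share1, share2, share3))

    # In a real system this seed would feed a key generator to produce the
    # public/private key pair the user then presents to the third-party site.
    seed
    [/code]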

    The cryptocurrency parvenu

    Interestingly, the precedent for an equally competent solution was set in 2008 when the cryptocurrency called bitcoins came online. Bitcoins are bits of code generated by complex mathematical calculations, and each is worth about $630 today. Using my public and private keys, I can perform bitcoin transactions, the records of which are encrypted and logged in a publicly maintained registry called the blockchain. Once the blockchain is updated with a transaction, no other information except the value exchanged can be retrieved. In April 2011, this blockchain was forked into a new registry for a cryptocurrency called namecoin. Namecoins and bitcoins are exactly the same but for one crucial difference. While bitcoins make up a decentralized banking system, namecoins make up a decentralized domain name system (DNS), a registry of unique locations on the Internet.

    The namecoin blockchain, as its website puts it, can “securely record and transfer arbitrary names,” or keys, an ability that lets programmers use it as an anonymizing layer to communicate between social network identities and third-party sites in the same way Crypto-Book does. For instance, OneName, a service that lets you use a social network identity to label your bitcoin address to simplify transactions, describes itself as

    a decentralized identity system (DIS) with a user directory made of entries in a decentralized key-value store (the Namecoin blockchain).

    Say I ‘register’ my digital identity with namecoin. The process of registration is logged on the blockchain and I get a public and private key. If Twitter is a relying party, I should be able to log in to it with my keys and start using it. Only now, Twitter’s server will log me in but not itself own the username with which it can monitor my behavior. And unlike with OpenID or OAuth, neither namecoin nor anyone else on the web can access my identity because it has been encrypted. At the same time, like with Crypto-Book, namecoin will use OAuth to communicate with the social networking and third-party sites. But at the end of the day, namecoin lets me mobilize only the proof that my identity exists – not the identity itself – in order to let me use services anonymously.

    If everybody’s wearing a mask, who’s anonymous?

    As such, it enables one of the most advanced anonymization services today. What makes it particularly effective is its reliance on the blockchain, which is not maintained by a central authority. Instead, it’s run by multiple namecoin users lending computing resources that process and maintain the blockchain, so there’s a fee associated with staking and sustaining your claim of anonymity. This decentralization is necessary to dislocate power centers and forestall precipitous decisions that could compromise your privacy or shut websites down.

    Services like IRC provided the zeroth level of abstraction to achieve anonymity in the presence of institutions like Facebook – by being completely independent and ‘unhooked’. Then, the OpenID protocol aspired, ironically, to some centrality by trying to set up one set of keys to unlock multiple doors. In this sense, the OAuth protocol was disruptive because it didn’t provide anonymity so much as an alternative route, by limiting the number of identities you had to maintain on the web. Then come the Crypto-Book and blockchain techniques, both aspiring toward anonymity, both reliant on Pyrrhic decentralization in the sense that the power to make decisions was not eliminated so much as extensively diluted.

    Therefore, the move toward privatization of digital identities has been supported by publicizing the resources that maintain those identities. As a result, perfect anonymity becomes consequent to full participation – which has always been the ideal – and the size of the fee to achieve anonymity today is symptomatic of how far we are from that ideal.

    (Thanks to Vignesh Sundaresan for inputs.)

  • What is VLBI?

    On June 25, scientists announced the discovery of a trio of supermassive black holes at the center of a galaxy 4.2 billion light years away. The find was credited to the European VLBI Network. A Space.com report stated that this network “could see details 50 times finer than is possible with the Hubble Space Telescope”. How is this achieved?

    VLBI stands for Very-Long-Baseline Interferometry. It is a technique used in astrometry to obtain high resolution images of the sky using a network of telescopes instead of using one big telescope. VLBI is commonly used to image distant cosmic radio sources such as quasars.

    This sophisticated technique has its roots in 19th-century physics, specifically in Thomas Young’s famous double-slit interference experiment of the early 1800s. When Young placed a screen with two extremely narrow slits in front of a light source, such as a burning candle, the shadow cast on the other side was actually an alternating patchwork of bright and dull bands. This was the interference pattern. Young’s experiment was important because it established that light travels as a wave, overturning Newton’s conviction that light was composed of particles.

    The interference pattern

    When light passes through each slit, it diffracts, i.e. starts to spread out. At some point in front of the slits, the diffracted waves meet and interfere. Where crest meets crest or trough meets trough, the interference is constructive and results in a bright band. Where crest meets trough, the waves cancel out and result in a dark band. If the position of the slits is changed, the interference pattern also shifts.
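
    In the standard textbook notation, with the slits separated by a distance d, the bright bands appear in directions θ where the path difference between the two slits is a whole number of wavelengths λ:

    \[ d \sin\theta = m\lambda, \qquad m = 0, \pm 1, \pm 2, \ldots \]

    The dark bands appear where the path difference is an odd number of half-wavelengths.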

    In VLBI, the candle is replaced by a distant source of radio waves, like a quasar. The slits are replaced by radio antennae on telescopes. Since the Earth is rotating, the antennae are in relative motion with respect to the quasar. As a result, the signals received by the two telescopes can be made to interfere. This interference pattern is processed at a central location along with the time at which each signal was received at each antenna, as recorded by a clock.

    In the second stage of this colossal Young’s experiment, let’s talk some wave physics. Radio waves have much longer wavelengths than visible light. As a result, radio telescopes have an inherently lower angular resolution than optical telescopes of the same size. Angular resolution is defined as the ratio of an emission’s wavelength to the diameter of the telescope receiving it. Qualitatively, it describes the smallest angular detail the telescope can distinguish in the image it receives, and the smaller it is, the better. For example, a 50-meter-wide radio telescope observing at a wavelength of 1 cm will have an angular resolution of 0.01/50 radians, or about 41.2 arc-seconds. An optical telescope of the same size will have an angular resolution of 0.004 arc-seconds, some 10,000 times better.

    Baseline + Atomic clocks

    VLBI resolves this issue (this isn’t really a pun). Because there are multiple telescopes receiving the radio signals, the angular resolution is redefined: it’s no longer the ratio between the wavelength and the diameter of the telescope but the ratio between the wavelength and the baseline. The baseline is the maximum physical separation between two telescopes in the array. If, say, the baseline is 1,000 km, the angular resolution of the array at the same 1-cm wavelength becomes about 0.002 arc-seconds, 20,000 times better.
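
    A quick check of those numbers, assuming the same 1-cm observing wavelength as above:

    [code language="r"]
    # Diffraction-limited angular resolution: wavelength / aperture, in radians
    rad_to_arcsec <- function(x) x * 180 / pi * 3600

    wavelength <- 0.01       # 1 cm (assumed observing wavelength), in m
    dish       <- 50         # a single 50-m dish, in m
    baseline   <- 1000e3     # a 1,000-km baseline, in m

    rad_to_arcsec(wavelength / dish)      # ~41.25 arc-seconds
    rad_to_arcsec(wavelength / baseline)  # ~0.002 arc-seconds
    [/code]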

    However, this technique couldn’t find wide implementation until atomic clocks were invented in the 1950s. Before they were around, a single reference clock had to be connected to multiple telescopes with cables, which limited the baseline length. With atomic clocks, telescopes could be placed on different continents because each station could keep precise time on its own and the recordings could be synchronized later.

    So, a telescope receives a radio signal, a computer sticks a timestamp on it and sends it to a central receiver. The receiver collates such data from different telescopes and creates the fringe pattern characteristic of interference. A processor finally reconstructs the source of all the radio waves from the fringe pattern and the times at which each signal was received at each location. Of course, there are many systems in between to stabilize and improve the quality of the signal, to coordinate observations by the telescopes, etc., but the basic principle is the same as in Young’s experiment of two centuries ago.
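
    Here is a toy version of that correlation step, with made-up numbers: two stations record the same noisy signal, one delayed with respect to the other, and sliding the recordings against each other recovers the delay.

    [code language="r"]
    # Two stations record the same emission; station 2 receives it 37 samples late.
    set.seed(1)
    n      <- 2000
    signal <- rnorm(n)   # the 'quasar' emission
    delay  <- 37         # true offset, in samples

    station1 <- signal + 0.5 * rnorm(n)
    station2 <- c(rep(0, delay), signal)[1:n] + 0.5 * rnorm(n)

    # Slide one recording against the other and find the lag of maximum correlation
    lags  <- 0:100
    corrs <- sapply(lags, function(k) cor(station1[1:(n - k)], station2[(k + 1):n]))
    lags[which.max(corrs)]   # recovers the 37-sample delay
    [/code]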

  • Inspecting nuclear warheads like they were passwords

    Nuclear weapon inspectors have a weighty but tricky job. An inspecting state relies on them to verify if a weapon is a nuclear warhead, but the state whose weapons are being inspected doesn’t want to divulge too much information about the weapon’s design or performance. As David Cliff, a researcher at the Verification Research, Training and Information Centre, London, writes,

    In warhead dismantlement, the objective needs to be to gain as much confidence through agreed verification measures as possible, thereby minimizing the extent to which trust [between the two states] will need to become a factor… In fact, as a means of building trust and confidence between states, dismantlement is of limited value unless it occurs in a transparent and verifiable manner.

    (Emphasis mine)

    So, on the one hand, transparency is needed to ensure the number of warheads has been reduced. On the other, secrecy is necessary to keep the warheads from reaching the hands of potential adversaries, not to mention to ensure deterrence. Methods to measure sensitive information often include safeguards to protect it, adding another layer of liability.

    To simplify this process, researchers from the USA and UK have developed a new technique to verify warheads without needing any sensitive information about them – thus eliminating the need for that information to be made available in the first place. They propose to bombard a supposed warhead with neutrons and then use a detector to record the properties of the particles that pass through. Next, an actual, known warhead is subjected to the same profiling.

    Then, an inspector randomly chooses to use each detector on other warheads that need to be inspected. Over multiple tests, the detector will be able to check with increasing likelihood whether a warhead is genuine by comparing it to previous tests. Crucially, the inspector will not have access to any parameters of the comparison – only to whether a ‘Yes’ or a ‘No’ has been signalled.
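
    The following is only a toy illustration of that last point – not the protocol in the Nature paper – showing how a comparison against a template can be reduced to a yes/no answer; all the counts and the tolerance are made up.

    [code language="r"]
    # Compare neutron counts recorded behind a candidate item against those from
    # a known genuine warhead, and report only a yes or a no to the inspector.
    set.seed(2)
    template  <- c(120, 300, 80, 450, 210)                    # counts from a true warhead (made up)
    candidate <- rpois(5, lambda = template)                  # item under inspection
    fake      <- rpois(5, lambda = c(500, 50, 400, 90, 300))  # a mismatched item

    is_genuine <- function(counts, template, tolerance = 4) {
      # Poisson counts fluctuate by roughly sqrt(expected); flag large deviations
      all(abs(counts - template) / sqrt(template) < tolerance)
    }

    is_genuine(candidate, template)  # TRUE: consistent with the template
    is_genuine(fake, template)       # FALSE: the pattern doesn't match
    [/code]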

    A paper describing this ‘zero-knowledge protocol’ appeared in Nature on June 26, in which the researchers argue,

    This technique will reveal no information about the composition or design of nuclear weapons when only true warheads are submitted for authentication, and so does not require an engineered information barrier.

    To assist in their analysis, the team used the unclassified British Test Object (BTO), which “does not contain special or other nuclear materials, but is used to develop and calibrate imaging systems for diagnostic analysis of nuclear weapons”. It consists of concentric rings of polystyrene, tungsten, aluminum, graphite and steel. Over multiple tests (i.e. simulations) on the BTO, the team then estimates the number of tests needed to reliably detect increasingly serious defects, finding 5,000 and 32,000 to be sufficient to detect the most serious ones.

  • A gamma ray telescope at Hanle: A note

    A gamma ray telescope is set to come up at Hanle, Ladakh, in 2015 and start operations in 2016. Hanle was one of the sites proposed to install a part of the Cherenkov Telescope Array, too. A survey conducted in the 1980s and 90s threw up Hanle as a suitable site to host telescopes because “it had very clear and dark skies almost throughout the year, and a large number of photometric and spectroscopic nights,” according to Dr. Pratik Majumdar of the Saha Institute of Nuclear Physics, Kolkata.

    The Cherenkov Telescope Array will comprise networked arrays of telescopes in the northern and southern hemispheres to study and locate sources of up to 100-TeV gamma rays. Dr. Subir Sarkar at Oxford University had told me at the time that “the CTA southern observatory will be able to study the center of the galaxy, while the northern observatory [of which the Hanle telescope will be a part] will focus on extra-galactic sources.” Another Cherenkov telescope, called HAGAR, has been in operation at Hanle since 2008, according to Dr. Majumdar.

    Artist’s conception of the CTA once installed at one of its sites. Image: Pratik Majumdar/SINP

    Although Hanle was in the running around July 2013, its name had been dropped from the list by April 2014. Dr. Sarkar had written to me earlier,

    “I realize it is interesting to mention to your readers that Hanle, Ladakh is a proposed site. However I should tell you that this is very unlikely – not because the site is unsuitable (in fact it is excellent from the scientific point of view) but because the Indian Govt. does not permit foreign nationals to visit there. I know a French postdoc who was at TIFR for several years and is now working with Pratik Majumdar at SINP … even he has been unable to get clearance to go to Hanle! I do think India needs to be more proactive about opening up to people from abroad, especially in science and technology, in order to benefit from international collaboration. Unfortunately this is not happening!”

    This ‘closedness’ showed up in another place recently: at the INO, Theni.

    Dr. Majumdar added,

    Almost all the research institutes and installations in India need to pull up their socks particularly in case of dealing with such bureaucratic procedures [of letting foreign scientists move around inside the country]. We do need to change this inhibitive attitude. BARC is another case where bringing in foreigners for work/visits is quite a big hassle and that is not just for foreigners, even any Indian national is not allowed to take laptops/CDs/other electronic items inside BARC without special permissions. This is unthinkable to me in today’s age. So, even though it does not sound very bad always, there are various layers of inhibition where at various levels this has to be fought.

    He added that HAGAR operated with similar restrictions. In fact, in 2018, another gamma-ray observatory is set to be installed at Hanle by TIFR and BARC. So, on the one hand, we have local scientific institutions asking for more international participation and eager to deliver results, and on the other, annoying bureaucratic restrictions on those who decide to participate.