GSLV D6 is a confidence booster

The GSLV Developmental-flight 6 launch by the Indian Space Research Organisation on August 27 was three things: the launch of the GSAT 6 satellite for the Indian military, the fifth successful launch of a GSLV rocket, and the second successful test-flight of the indigenous cryogenic upper-stage engine. The satellite is a two-tonne behemoth, too heavy for the PSLV rocket – whose maximum payload capacity to the geostationary transfer orbit is 1,410 kg – to heft, so the GSLV it was. And the cryogenic upper-stage enables the GSLV to lift a heavier payload: 2,500 kg to the geostationary transfer orbit.

But the most important takeaway lies in the big picture. This may be the fifth successful launch of the GSLV in nine tries, but it’s the second successive one. This may be only the third flight of the indigenous cryogenic upper-stage, but it’s its second successive success. Both accomplishments signify that ISRO’s scientists have been learning the right lessons from previous failures and that the GSLV is on the road to establishing reliability.

The previous successful test flight of the cryogenic engine was in January 2014, and the test before that, in 2010, was a failure. While the PSLV rocket has four stages, of alternating solid and liquid ones, the GSLV Mk rockets that use the engine have three: solid, liquid and cryogenic. The solid stage uses the S139 motor and the liquid stage, a modified Viking 4 engine of France. Thanks to their extended legacies, it was easier for ISRO to adapt them for Indian rockets. The cryogenic engine, however, had to be developed indigenously after the promised technology transfer from Russia fell through in the early 1990s for political reasons.

With two successful flights in two years, the space agency now has reason to believe the engine could finally be past its teething troubles. And despite its intricate engineering, its success makes things simpler for ISRO. Before January 2014, ISRO was also considering a variety of Russian engines to power the upper stage of the GSLV Mk I and II, all to no avail. Now it can focus on perfecting the cryogenic engine for the next big-picture milestone: at least two GSLV launches every year, which translates to 4-5 tonnes of payload lofted annually.

Note: This article was updated at 1.42 am on August 29, 2015, to say that the solid stage of the GSLV uses the S139 engine, not Nike-Apache, and that the liquid stage uses a modified Viking 4 engine, not the Vulcain.

Roundup of missed stories – August 26, 2015

Too many things to do at work this last week, so much so that I missed writing/blogging about a bunch of articles and papers that in other circumstances I’d have loved to discuss. Here they are, rounded up on the chance that you might find one of them interesting and consider taking the debate surrounding it to a larger audience.

  1. Something at the Milky Way’s centre survived an encounter with a black hole – “The G2 cloud in our Galaxy’s core has survived an encounter with the central black hole and failed to trigger a major flare-up in the black hole’s activity. A promising theory endeavours to explain the cloud’s nature.”
  2. What does the way Amazon supposedly treats its employees speak about the future of employment in the tech. industry? – “Bezos’s statement was promptly satirized in a hilarious piece by Andy Borowitz, which you really must read. Borowitz’s conceit is that Amazon mandates that anyone who is not acting compassionately to their fellow employees will be fired the next day. And then there is a thoughtful piece at Pando, which insists that we will soon be forced to choose between the hard-charging culture of the tech industry and the more humanistic values that we may privately prefer. I don’t know if we have to choose — I don’t know if we have a choice — but the point is well taken: At what point do we stand up and say, We don’t want to live this way?”
  3. Following criticism, PLOS removes blog defending scrutiny of science – Although the tools we use to ensure transparency can be abused, that’s a necessary risk, note Seife and Thacker:

    “To be sure, the same mechanisms that watchdogs use to uncover scientific wrongdoing have been abused in the past. Climate scientist Michael Mann, for instance, was subject to invasive and harassing requests for information via freedom of information laws, via judicial-branch powers, and via congressional requests. No doubt they will be abused in the future.

    But transparency laws remain a fundamental tool for monitoring possible scientific misbehavior. And it would be a mistake to believe that scientists should not be subject to a high level of outside scrutiny. So long as scientists receive government money, they are subject to government oversight; so long as their work affects the public, journalists and other watchdogs are simply doing their jobs when they seek out possible misconduct and questionable practices that could threaten the public interest.”

  4. A double-blind randomized clinical trial on the efficacy of magnetic sacral root stimulation for the treatment of Monosymptomatic Nocturnal Enuresis [bed-wetting] – “Both treatment and control groups were comparable for baseline measures of frequency of enuresis, and VAS. The mean number of wet nights/week was significantly reduced in patients who received real rSMS. This improvement was maintained 1 month after the end of treatment. Patients receiving real-rSMS also reported an improvement in VAS ratings and quality of life. A significant reduction of resting motor threshold was recorded after rSMS in the real group while no such changes were observed in the sham group.”
  5. Death metal in ancient oceans – “About 420 million years ago, near the end of the so-called Silurian period, the last of a series of mass extinctions struck the world’s oceans. Some scientists have suggested these die-offs were caused by worldwide cold spells. But a new study hints that the extinctions—which mostly affected corals, colony-making creatures called graptolites, and eel-like creatures called conodonts—may have instead been caused by changes in ocean chemistry, including reduced oxygen and elevated concentrations of toxic metals dissolved in the seawater.”
  6. A new player in the well-contested field of atomic microscopy – “We introduce a scanning probe technique that enables three-dimensional imaging of local electrostatic potential fields with subnanometer resolution. Registering single electron charging events of a molecular quantum dot attached to the tip of an atomic force microscope operated at 5 K, equipped with a qPlus tuning fork, we image the quadrupole field of a single molecule. To demonstrate quantitative measurements, we investigate the dipole field of a single metal atom adsorbed on a metal surface.”
  7. Do we have the fusion reactor we need? Or is this another one of those premature promises? – “A privately funded company called Tri Alpha Energy has built a machine that forms a ball of superheated gas—at about 10 million degrees Celsius—and holds it steady for 5 milliseconds without decaying away. That may seem a mere blink of an eye, but it is far longer than other efforts with the technique and shows for the first time that it is possible to hold the gas in a steady state—the researchers stopped only when their machine ran out of juice.”
  8. Is it better to stay in the dark about our own genetic secrets? – “If you have a terminal illness that’s completely untreatable, you might genuinely be happier living your last months in ignorance. A diagnosis might allow you to seek treatment giving you an extra month of life, but if that extra month is riddled with fear and sadness, it might not be worth it. In these cases, it seems like ignorance really might be preferable.”

Conflicts in the Middle East are bringing down NOx emissions

Two years ago, a study in Science put a detailed analysis behind an idea that had already taken root in a lot of people’s minds: that the unfavourable weather conditions climate change was creating around the world could be related to the world’s growing tendency toward conflict. The study’s authors weren’t saying that bad weather caused the Second World War, only that it would be legitimate to consider whether the changing climate had an adverse impact on human neurophysiology. But setting aside the specifics, the study’s bigger accomplishment was in encouraging a more holistic view of climate change’s impact on humankind.

Now, another study, based on satellite observations and economic data from the World Bank, sets out a karmic inverse of that idea: that conflicts in the Middle East have led to cleaner air over the region.

The satellite data comes from Aura, which NASA launched in 2004 to observe Earth’s atmosphere. One instrument aboard is the Ozone Monitoring Instrument (OMI), which measures variations in the ozone layer and makes precision measurements of trace pollutants across a 2,600-km swath, letting it log near-global data almost daily. And since around 2008, it has found that nitrogen dioxide levels – the gas is released by burning fossil fuels – have been dropping over some cities in the Middle East and over Athens, Greece.

Specifically, OMI found that the density of nitrogen dioxide over Cairo, Athens, Tehran and Esfahan (Iran), Baghdad, Tikrit and Samarra (Iraq), the Palestinian territories, Beirut and Tripoli (Lebanon), and Damascus and Aleppo (Syria) correlates strongly with the shifting political climates in the region. The study describing the findings, published in Science Advances on August 21, 2015, is able to exclude natural variations because the trends were uniformly increasing until 2008-2010 – as in almost all places around the world – before deviating significantly.

Since the Arab Spring in 2011 saw a popular uprising starting with Cairo and spreading out into the rest of the Middle East, the GDPs of all the involved economies shrank. Now, the OMI data presents the climatic impact of humanitarian crises and armed conflict – one that long-term projections of the impact of climate change haven’t factored in.

For example, the most significant changes are visible over Greece, Egypt, Iraq, Iran and Syria, as well as over Saudi Arabia and the United Arab Emirates. However, the latter two are excluded from the authors’ analysis because the reversal in nitrogen dioxide trends over those countries began before the region turned turbulent – following laws the emirates and the kingdom enacted in 2006 and 2008 to reduce their carbon footprints.

In Egypt, on the other hand, the GDP grew by about 6% per year in 2005-2010 and then by only 2% per year in 2011-2014. But OMI couldn’t spot any parallel decline in carbon dioxide, so the decline in nitrogen dioxide is being attributed to reduced vehicular emissions as petrol became more expensive. In Greece, the economic recession caused nitrogen dioxide emissions to fall by 40% in the six years after 2008. In Iran, Tehran’s and Esfahan’s nitrogen dioxide emissions increased at 10% per year in 2005-2010 – as if the 2006 sanctions hadn’t happened – but turned to -4% per year after 2010, when GDP growth also saw a sharp downturn of 2 percentage points before turning negative in 2012.

A and B) Tropospheric NO2 column density changes in 10^15 molecules/cm2 (A) between 2005 and 2010 and (B) between 2010 and 2014. Source: dx.doi.org/10.1126/sciadv.1500498

In Iraq, similar correlations between declining GDP and falling nitrogen dioxide emissions are observed over Baghdad, Mosul and Kirkuk, as well as additional declines over the cities of Tikrit and Samarra thanks to incursions by the Islamic State.

As the authors of the Science Advances article write, “such relatively short-term changes cannot be captured by air pollution emission inventories and future projections, including the Representative Concentration Pathways”. The RCPs are a set of projections used by the Intergovernmental Panel on Climate Change based on how strongly greenhouse gases are forcing anthropogenic climate change. One of them, RCP4.5, assumes that NOx emissions in the Middle East will stay constant from 2005 to 2030, and another, RCP8.5, that they’ll increase at 2% per year. OMI’s findings suggest these assumptions no longer reflect reality.
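To put the RCP8.5 assumption in perspective – a simple compounding calculation on the stated 2% rate, not a number from the study – growth at 2% per year over the 25 years from 2005 to 2030 implies

\[ (1.02)^{25} \approx 1.64, \]

i.e. NOx emissions about 64% higher in 2030 than in 2005, whereas OMI has been recording declines over much of the region since around 2010.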

It also shows how a better estimate of emissions, as well as of country-wise challenges, emerges when ground realities are combined with satellite-logged data. Consider Lebanon, whose carbon dioxide emissions fell by 20% over 2011 and 2012 but whose nitrogen dioxide emissions spiked by 20-30% in 2014. When they probed further, the scientists realised it had to do with the influx of refugees from the Syrian civil war – 1.2 million of them, of whom 350,000 fled to Beirut alone.

Featured image credit: magharebia/Flickr, CC BY 2.0.

GitHub hit by DDoS attack

The collaborative coding platform GitHub became the subject of a DDoS attack on August 25, its second this year after having been targeted by a massive attack in March. The issue first appeared at 3.05 pm IST, according to GitHub’s status log, when administrators began inspecting “connectivity problems”. By 4.08 pm, the issues were identified to be the result of a DDoS attack. At 4.36 pm, it was mentioned that the attack was ongoing. The last updates from GitHub said at 6.22 pm that normal service had been restored and that the situation was being monitored closely, and at 7.19 pm that everything was “operating normally”.

DDoS stands for distributed denial-of-service: thousands of IP addresses – often spoofed – ping a target IP and force it to respond. A ping, according to SC Magazine, is “a type of networking utility that determines whether or not a host is reachable, and how long it takes to be reached”. It’s a very small packet of data that, if echoed back by the target, signals that the target IP is live. But under a swarm of pings, the target is eventually overwhelmed, slowing to a crawl or crashing, unable to handle the traffic.
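For a concrete sense of what that reachability check involves, here is a minimal Python sketch. It is not an ICMP ping (which needs raw sockets and elevated privileges) and certainly not a DDoS tool; it simply times a single TCP connection attempt, answering the same “is the host up, and how quickly does it respond?” question a ping utility asks. The host and port below are illustrative.

    # Illustrative sketch: approximates what a ping utility measures by timing a
    # single TCP connection attempt. A real ping sends ICMP echo requests, which
    # require raw sockets and elevated privileges.
    import socket
    import time

    def reachable(host: str, port: int = 443, timeout: float = 2.0):
        """Return the round-trip time in milliseconds, or None if unreachable."""
        start = time.monotonic()
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return (time.monotonic() - start) * 1000
        except OSError:
            return None

    if __name__ == "__main__":
        rtt = reachable("github.com")
        print("github.com:", f"{rtt:.1f} ms" if rtt is not None else "unreachable")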

For the people perpetrating the attack, the intent is to make the target address unavailable to legitimate users. At 5.40 pm, the Norse Corp map of live DDoS attacks identified two prominent target locations in the United States, one of which (around the Missouri-Illinois-Iowa area) was the subject of an intense assault from Southeast Asia and southern Europe.

A map of ongoing DDoS attacks. Source: Norse Corp

GitHub has been the subject of multiple DDoS attacks in its history. The platform is effectively a collection of repositories – projects that developers are working on – and attackers miffed by the contents of individual repositories often try to take down the entire site. It was on the back of similar concerns that the Chinese government blocked GitHub in China in January 2013, and the Indian government did so for a short while in December 2014.

While DoS attacks have been around since the 1990s, DDoS attacks emerged around 2000, with one of the first targets being Yahoo!. The difference is that DoS attacks originate from a single source while DDoS attacks are distributed across multiple sources. They’re also hard to anticipate, very difficult to defend against, and very difficult to trace. Attackers have been known to go after all kinds of online services – from banks to government sites to gaming tournaments. According to an Akamai report released earlier this month, India is the fourth largest target of DDoS attacks worldwide, accounting for 7.43% of all attacks. Interestingly, as Trak.in writes, “China is … both the largest source and target of attacks on web applications”.

Curious Bends – tumour twin, ethical non-vegetarians, fixing Indian science, and more

Apologies for the unplanned summer holiday, but we’re back!

1. Was the tumour inside her brain her twin? (Audio)

She moved from Hyderabad to do her PhD at Indiana University and began experiencing headaches and suffering from sleep disorders. Co-workers and friends would speak to her, only for the sentences to get all garbled. She was in excruciating pain. What was this tumour that was growing inside her brain? Why was it wreaking havoc in her life? What if what was growing inside her head had a life of its own? (audiomatic.in, 13 min listen)

2. An India-born Nobel laureate’s solutions for fixing science in India 

“Venkatraman Ramakrishnan is a biologist—even though he won the Nobel Prize in chemistry in 2009—and an Indian at heart, even though he has spent most of his life in the US and the UK where his work led to the prize. His career has been unusual, just as his achievements. In December, he is going to take his new position as the president of the Royal Society, the world’s oldest and most esteemed scientific society. He will be the first non-white president in its 350-year history, and he has already made plans to invigorate scientific ties between India and the UK.” (qz.com, 7 min read)

3. The only ethical way to eat meat: become scavengers

“The first and less realistic way is to replace hunting with scavenging. Scavenging for wild animals is a non-exploitative method of obtaining animal flesh. A more achievable and safer option would be to do something closer to agriculture as we now know it: domesticate the scavenger hunt. That is, raise animals—preferably ruminants—on limited pasture with the utmost attention to their welfare, allow them a life free of human exploitation, feed them natural diets in appropriate habitats, allow them to die a natural death, and then, and only then, consume them.” (psmag.com, 7 min read)

4. The woman who could stop climate change

“I asked what would happen if the emissions line did not, in fact, start to head down soon. Tears welled up in her eyes and, for a moment, Christiana Figueres, the head of United Nations Framework Convention on Climate Change, couldn’t speak. “Ask all the islands,” she said finally. “Ask Bangladesh. We just can’t let that happen. Do we have the right to deprive people of their homes just because I want to own three SUVs? It just doesn’t make any sense. And it’s not how we think of ourselves. We don’t think of ourselves as being egotistical, immoral individuals. And we’re not. Fundamentally, we all have a morality bedrock. Every single human being has that.”” (newyorker.com, 25 min read)

5. Although patents were designed to promote innovation, they don’t

“The public-good position on patents is simple enough: in return for registering and publishing your idea, which must be new, useful and non-obvious, you get a temporary monopoly—nowadays usually 20 years—on using it. This provides an incentive to innovate because it assures the innovator of some material gain if the innovation finds favour. It also provides the tools whereby others can innovate, because the publication of good ideas increases the speed of technological advance as one innovation builds upon another. But a growing amount of research in recent years suggests that, with a few exceptions such as medicines, society as a whole might even be better off with no patents than with the mess that is today’s system.” (economist.com, 15 min read)

Chart of the week

“By analysing global migration trends among professionals, the social network found India ended 2014 with 0.23% fewer workers than the beginning of the year. This represents the biggest loss seen in any country it tracked, according to LinkedIn.” (qz.com, 2 min read)

Countries to which Indian professionals are migrating. Source: Quartz

Not all waterworlds can host life

Early in its history, Venus was in the Solar System’s habitable zone – much like Earth is now. Scientists think its surface contained liquid water, and its atmosphere was somewhat like Earth’s. Maybe there was life, too. However, as the levels of carbon dioxide kept increasing, its atmosphere became opaque to outgoing heat, trapping most of what the surface radiated, and Venus heated up to the point where its oceans boiled away. Today, life on the planet’s waterless surface is considered unlikely, except perhaps by those who’ve read a November 2014 study involving supercritical carbon dioxide, and those who believe in Hell.

Why can’t this be the case on alien worlds possessing water as well? Discoveries made since the mid-1990s – especially by the Kepler space telescope and probes in the Jovian and Saturnian systems – have unearthed a variety of worlds that could, or do, have liquid water on or below the surface. On Earth, life has been found wherever liquid water has been found, so liquid water on other planets and moons gets scientists excited about the possibility of alien life. Recent discoveries of a subsurface ocean on Europa and possibly on some other moons of Jupiter and Saturn have even prompted NASA to plan for a probe to Europa in the mid-2020s.

A study published online (paywall) in the Monthly Notices of the Royal Astronomical Society applies the brakes on that excitement to some extent. One kind of exoplanet that scientists think could host lots of liquid water – some 100 times the amount of water on Earth, in fact – is the so-called ‘waterworld’. Such planets would have oceans so deep and wide that, according to the study, their effects on the planet and its climate would be nothing like the oceans’ effects on Earth – and they might not be hospitable to life the way we know liquid water usually can be.

The study’s authors write, “One important consequence is, for example, the formation of high-pressure water ice at the bottom of the ocean, which prevents the immediate contact of the planetary crust with the liquid ocean.” This in turn mutes the carbon-silicate cycle, a recycling of carbon and silicon compounds on the ocean floor that determines how much carbon dioxide is released from the oceans into the atmosphere.

The authors calculate that an (at least) Earth-sized waterworld in the habitable zone of its star can hold 25-100 Earth oceans of water for temperatures ranging from the freezing point of water to just beyond the boiling point. So a colder planet, say at 0° C, would have a smaller ocean and less liquid water with which to absorb carbon dioxide (and its absorptive capabilities can’t ‘power up’ without the carbon-silicate cycle). At the same time, colder water dissolves more gas, even as the gas’s pressure on the ocean’s surface rises. So a colder planet with a smaller ocean will draw more carbon dioxide out of the atmosphere – turning the planet even cooler.
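The gas-solubility argument is essentially Henry’s law with a van ’t Hoff temperature correction. As a rough sketch – using textbook values for carbon dioxide in water, not numbers from the paper – the dissolved concentration at partial pressure p is

\[ c = k_H(T)\,p, \qquad k_H(T) \approx k_H^{\circ} \exp\!\left[ C\left(\frac{1}{T} - \frac{1}{T^{\circ}}\right) \right], \]

with k_H° ≈ 3.3 × 10^-2 mol/(L·atm) at T° = 298 K and C ≈ 2,400 K for CO2. Because C is positive, lowering T raises k_H: a colder ocean holds more dissolved carbon dioxide at the same partial pressure, which is why the cold branch of the feedback keeps getting colder.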

Similarly, a warmer waterworld will be able to absorb less carbon dioxide, letting the greenhouse gas accumulate in the atmosphere, heat the surface up and eventually boil the oceans away (like on Venus). In short, a waterworld whose temperatures are outside a specific range will become hotter if it’s warm and even colder if it’s cold. These runaway effects can occur pretty quickly, too. 

Based on the chemical properties of water and carbon dioxide, the scientists estimate that the life-friendly temperature range is from 273 K to 400 K (0° to 127° C). And even in this range, there could be threats to life in the form of ocean acidity. On Earth, limestone that’s in contact with water dissolves and keeps the water’s acidity in check, but this may not be happening on waterworlds where large landmasses could be a rarity or relatively smaller in size.

At the same time, these pessimistic speculations are offset by some assumptions the scientists have made in their study. For example, they assume that the waterworld doesn’t have tectonic activity. Such activity on Earth involves the jigsaw of landmasses grinding against each other, sometimes subducting one below the other to push some minerals down while volcanoes elsewhere spew out others – in all, a giant geological cycle that ensures the substances needed to sustain life are constantly replenished. If a waterworld were to have tectonic activity, it would also influence the carbon-silicate cycle and keep a runaway greenhouse effect from happening.

On Earth, the warming of the oceans presents a big problem to climatologists, partly because its mechanisms and consequences are not fully understood – and an even bigger one to marine creatures. And as the oceans dissolve more anthropogenic carbon dioxide, they also become more acidic. Yet these effects are relatively small (ignoring the presence of life for a moment) compared to those on waterworlds – planets with no above-sea-level landmasses and unbroken seas 100 km deep.

Featured image credit: Lucianomendez/Wikimedia Commons, CC BY-SA 4.0.

The Wire
August 23, 2015

ACAT in the wild

The software working behind the robotic voice of Stephen Hawking was released for public use on August 18 by Intel, the company that developed it. Although principally developed for Hawking, the ‘tool’ has since been made available to many other people suffering from motor neurone disease, an ailment that gradually but steadily deadens the neurons controlling various muscles of the body, rendering its victims incapable of, say, moving the muscles that produce speech. Intel’s software, called the Assistive Context-Aware Toolkit (ACAT), steps in to translate visual signals like facial twitches into speech. Its source code and installation instructions are available on GitHub.

ACAT is an assembly of components that each perform a distinct function. In order of operation: an input device picks up the visual signals (cheek-muscle twitches, in Hawking’s case), a calibrated text-prediction tool generates the corresponding unit of language, and a speech synthesiser vocalises the text. The first two components are unified by Microsoft’s Windows Communication Foundation. In ACAT’s case, the text-prediction is performed by a tool called Presage, developed by the Italian developer Matteo Vescovi. Other possible input devices include proximity sensors, accelerometers and buttons.
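To make the division of labour concrete, here is a minimal Python sketch of such a sense-predict-speak pipeline. The names and classes are invented for illustration; they are not ACAT’s actual interfaces, which live in the project’s Windows codebase on GitHub.

    # Hypothetical sketch of an ACAT-like pipeline; the names are illustrative,
    # not Intel's API. Input events -> text prediction -> speech synthesis.
    from dataclasses import dataclass
    from typing import Callable, List

    @dataclass
    class SensorEvent:
        """One detected input event, e.g. a cheek-muscle twitch or a button press."""
        kind: str         # "twitch", "button", "proximity", ...
        timestamp: float  # seconds since the session started

    def predict_text(events: List[SensorEvent], predictor: Callable[[int], str]) -> str:
        """Turn a stream of input events into text via a calibrated predictor."""
        # In ACAT the prediction is done by Presage; here a stand-in callable
        # simply receives the number of events registered so far.
        return predictor(len(events))

    def speak(text: str) -> None:
        """Stand-in for the speech synthesiser that vocalises the predicted text."""
        print(f"[synthesiser] {text}")

    if __name__ == "__main__":
        events = [SensorEvent("twitch", t) for t in (0.1, 0.4, 0.9)]
        speak(predict_text(events, predictor=lambda n: f"{n} twitches -> 'hello'"))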

According to the BBC, the UK’s MND Association has celebrated the release. Of the motives behind it, Intel wrote, “Our hope is that, by open sourcing this configurable platform, developers will continue to expand on this system by adding new user interfaces, new sensing modalities, word prediction and many other features.” Company spokesperson Lama Nachman also noted that, with the current release, Intel isn’t anticipating ‘all kinds’ of innovations as much as assistive ones. A detailed user guide is available here.

A stinky superconductor

The next time you smell a whiff of rot in your morning’s eggs, you might not want to throw them away. Instead, you might do better to realise that what you’re smelling could be, under the right conditions, a superconductor – one that has, incidentally, riled up the scientific community.

The source of the excitement is a paper published in Nature on August 17, penned by a group of German scientists, describing an experiment in which the compound hydrogen sulphide conducts electricity with zero resistance under a pressure of 90 gigapascals (about 888,231 times the atmospheric pressure) – when it turns into a metal – and at a temperature of 203.5 kelvin, about -70° C. The discovery makes it an unexpected high-temperature superconductor, doubly so for becoming one under conditions physicists don’t find too esoteric.

The tag of ‘high-temperature’ may seem unfit for something operating at -70° C, but in superconductivity, -70° C approaches summer in the Atacama. When the phenomenon was first discovered – by the Dutch physicist Heike Kamerlingh Onnes in 1911 – it required the liquid metal mercury to be cooled to 4.2 kelvin, about -269° C. What happened in those conditions was explained by an American trio with a theory of superconductivity in 1957.

The explanation lies in quantum mechanics, where every particle has a characteristic ‘spin’ number. QM allows particles with integer spin (0, 1, 2, …) to – in some conditions – cohere into one bigger ‘particle’ coherent enough to avoid being disturbed by things like friction or atomic vibrations*. Electrons, however, have half-integer (1/2) spin, so on their own they can’t slip into this state. In 1957, John Bardeen, Leon Cooper and Robert Schrieffer proposed that at very low temperatures – like 4 K – the electrons in a metal interact with the positively charged latticework of atoms around them to pair up with each other. These electronic pairs are called Cooper pairs, kept twinned by vibrations of the lattice. Each pair’s total spin is an integer, allowing all of them to condense into one cohesive sea of electrons that then flows through the metal unhindered.
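For the record, the spin bookkeeping is standard quantum mechanics rather than anything specific to the new paper: two spin-1/2 electrons combine as

\[ \tfrac{1}{2} \otimes \tfrac{1}{2} = 0 \oplus 1, \]

i.e. into a singlet (total spin 0) or a triplet (total spin 1). Both are integers, so a Cooper pair behaves as a composite boson; in conventional BCS superconductors the pairing is the spin-0 singlet.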

The BCS theory soon became the ‘conventional’ theory of superconductivity, able to explain the behaviour of many metals cooled to cryogenic temperatures. The German team’s hydrogen sulphide system is also one such conventional scenario – in which the gas had to be compressed into a metal before its superconducting abilities were teased out.

The team, led by Mikhail Eremets and Alexander Drozdov from the Max Planck Institute for Chemistry in Mainz, first made its claims last year, that under heavy pressure hydrogen sulphide becomes sulphur hydride (H2S → H3S), which in turn is a superconductor. At the time their experiment showed only one of two typical properties of a superconducting system, however: that its electrical resistance vanished at 190 K, higher than the previous record of 164 K.

Their August 17 paper reports that the second property has since been observed, too: that pressurised hydrogen sulphide doesn’t allow any external magnetic field to penetrate beyond its surface. This effect, called the Meissner effect, is observed only in superconductors. For Eremets, Drozdov et al, this is the full monty: a superconductor functioning at temperatures that actually exist on Earth. But for the broader scientific community, the paper marks the frenzied beginning of a new wave of experiments in the field.

Given the profundity of the findings – of a hydrogen-based high-temperature superconductor – they won’t enter the canon just yet but will require independent verification from other teams. A report by Edwin Cartlidge in Nature already notes five other teams around the world working on replicating the discovery. If and when they succeed, the implications will be wide-ranging – for physics as well as historical traditions of physical chemistry.

The BCS theory of superconductivity provided a precise mechanism of action that allowed scientists to predict the critical temperature (Tc) – below which a material becomes superconducting – of all materials that abided by the theory. Nonetheless, by 1957, the highest Tc reached had been 10 K despite scientists’ best efforts; so great was their frustration that in 1972, Philip Warren Anderson and Marvin Cohen predicted that there could be a natural limit at 30 K.

However, just a few years earlier – in 1968 – two physicists, Neil Ashcroft and Vitaly Ginzburg, refusing to subscribe to a natural limit on the critical temperature, proposed that the Tc could be very high in substances in which the vibrations of the atomic latticework surrounding the electrons were especially energetic. Such vigour is typically found in the lighter elements like hydrogen and helium. Thus, the Ashcroft-Ginzburg idea effectively set the theoretical precedent for Eremets and Drozdov’s work.

But between the late 1960s and 2014, when hydrogen sulphide entered the fray of experiments, two discoveries threw the BCS theory off kilter. In 1986, scientists discovered the cuprates, a class of copper compounds that would eventually superconduct at up to 133 K (164 K under pressure) but didn’t function according to the BCS theory. Thus, they came to be called unconventional superconductors. The second discovery was of another class of unconventional superconductors, compounds of iron and arsenic called pnictides, in 2008. The highest Tc among them was less than that of the cuprates. And because cuprates under pressure could muster a Tc of 164 K, scientists pinned their hopes of breaching the room-temperature barrier on them, and worked on developing an unconventional theory of superconductivity.

But for those choosing to persevere with the conventional order of things, there was a brief flicker of hope in 2001 with the discovery of magnesium diboride superconductors: they had a Tc of 39 K, an important but not very substantial improvement on previous records among conventional materials.

The work of Eremets & Drozdov was also indirectly assisted by a group of Chinese researchers who, in 2014, were able to anticipate hydrogen sulphide’s superconducting abilities using the conventional BCS theory. According to them, hydrogen sulphide would become a metal under the application of 111 gigapascals of pressure, with a Tc between 191 K and 204 K. And if the experimental result survives independent scrutiny intact, the Chinese theoretical work will prove valuable as scientists confront their next big challenge: pressure.

The ultimate fantasy would be a Tc in the range of ambient temperatures. Imagine leagues of superconducting cables radiating out from coal-choked power plants, a gigawatt of power transmitted for a gigawatt of power produced**, or maglev trains running on superconducting tracks at lower costs and currents, or the thousands of superconducting electromagnets around the LHC no longer needing to be supercooled in jackets of liquid helium. Sadly, that Eremets & Drozdov have (probably) achieved a Tc of 203.5 K doesn’t mean the engineering is accessible or affordable. In fact, what allowed them to reach 203.5 K is also the barrier to the technology being used ubiquitously, making their feat a herald of possibilities rather than a demonstration in itself.

It wasn’t possible until the 1970s to achieve pressures of a few gigapascals in the lab, and similar processes today are confined to industrial purposes. A portable device that could sustain such pressure across large areas is difficult to build – yet that is the only regime in which metallic sulphur hydride shows itself. In their experiment, Eremets and Drozdov packed a cold mass of hydrogen sulphide against a stainless steel gasket using an insulating material like teflon, and then sandwiched the pellet between two diamond anvils that pressurised it. The pressurised sample was a little more than 100 micrometres across. Moreover, they note in their paper that the ‘loading’ of the hydrogen sulphide between the anvils needs to be done at a low temperature – before pressurisation – so that the gas doesn’t decompose before it can be made to superconduct.

These are impractical conditions if hydrogen sulphide cables are to be handled by a crew of non-specialists, in environments nowhere near as controllable as the inside of a small steel gasket. More likely, should independent verification of the Eremets & Drozdov experiment happen, scientists will treat it as validation of the Chinese theorists’ calculations and extend those to fashion a material better suited to their purposes.

*The foundation for this section of QM was laid by Satyendra Nath Bose, and later expanded by Albert Einstein to become the Bose-Einstein statistics.

**But not a gigawatt of power consumed, thanks to power thefts to the tune of Rs.2.52 lakh crore.

The chemistry of the Tianjin warehouse blasts

The Tianjin warehouse blasts on August 12 caused a weak surface quake in the Chinese port city, made houses uninhabitable in a 2-km radius, destroyed a parking lot of 8,000 cars in the vicinity, blew a crater where the warehouse stood, released highly toxic chemicals into the surrounding air and water, and killed 114 people, with the toll still rising. The structure was known to contain hazardous substances, and claims have surfaced that the company managing them – Ruihai International Logistics – was transporting obscene quantities illegally.

But beyond the illegality itself, the explosions had an underlying chemistry whose violence far outstripped what the structures in the vicinity could withstand – and which compounded the government’s failures: not clamping down on Ruihai earlier, allowing residential settlements in the neighbourhood of such chemicals and, in the aftermath, reacting too slowly both to allegations of cover-ups and to the media’s questions about what had gone up in flames. According to one video (link now dead) on Weibo, the Chinese microblogging website, one of the principal explosions occurred after the firefighters arrived, implying that the destruction was drawn out by the already-burning warehouse being sprayed with water by the authorities.

For instance, the structure was known to contain ammonium nitrate, potassium nitrate and calcium carbide, as well as sodium cyanide. Of these compounds, ammonium nitrate and calcium carbide are known to be explosive – but for different reasons – while the rest are deadly in their own right.

Ammonium nitrate

Simply put, ammonium nitrate is the Liam Neeson of fertilisers but also the Ra’s al Ghul of explosives. It was used as the explosive material in the 2013 Hyderabad blasts. Each molecule contains two atoms of nitrogen, three of oxygen and four of hydrogen. The way the atoms are bonded, the molecule as a whole is eager to give away an oxygen atom to any other molecule reacting with it, because doing so sends it to a stable, less energetic state. Given a strong enough jolt or enough heat, this eagerness turns into an explosive decomposition, generating a shockwave travelling at some 5,270 m/s – more than 15 times the speed of sound in air.
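The idealised, textbook detonation reaction – a schematic of the chemistry, not a reconstruction of what happened in Tianjin – shows where the oxygen and the sudden volume of gas come from:

\[ 2\,\mathrm{NH_4NO_3} \rightarrow 2\,\mathrm{N_2} + \mathrm{O_2} + 4\,\mathrm{H_2O} \]

Two moles of a dense solid become seven moles of hot gas, and that abrupt expansion is what drives the shockwave.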

What could make such a shockwave even more powerful has to do with another property of ammonium nitrate. If not stored in tightly sealed containers, the compound gradually absorbs moisture from the atmosphere and coalesces into solid lumps. In the Tianjin warehouse, then, large quantities of ammonium nitrate could’ve become moist and formed proximate clumps, and when one section of those clumps got heated, it blew up and generated a shockwave that shot the rest of it to Hell as well.

Calcium carbide

By itself, calcium carbide is mostly harmless. But spray it with water and it reacts to produce calcium hydroxide (slaked lime) and acetylene. When acetylene is burnt in the presence of oxygen, it produces a flame of about 3,600 K, hot enough to melt a metal as sturdy as tungsten – hence its use in welding. And like ammonium nitrate, acetylene is susceptible to shock, especially if it is compressed beyond roughly 103,421 pascals (about 15 psi) above atmospheric pressure. In such situations, it decomposes explosively into its constituent elements – carbon and hydrogen.

And here it gets worse: hydrogen burns violently with… well, the atmosphere. Remember the reactor-3 explosion during the Fukushima disaster in 2011? That wasn’t the work of any nuclear substance as much as leaked hydrogen.
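For reference, the textbook reactions behind the sequence above – generic chemistry, not measurements from the site – are:

\[ \mathrm{CaC_2} + 2\,\mathrm{H_2O} \rightarrow \mathrm{Ca(OH)_2} + \mathrm{C_2H_2}, \qquad 2\,\mathrm{C_2H_2} + 5\,\mathrm{O_2} \rightarrow 4\,\mathrm{CO_2} + 2\,\mathrm{H_2O} \]

The first is why dousing burning calcium carbide with water makes things worse; the second is the acetylene combustion that welders exploit.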

Potassium nitrate and sodium cyanide

While potassium nitrate is used in the preparation of gunpowder, it’s explosive when reacting with reducing agents – i.e. electron donors. The incredibly poisonous sodium cyanide, on the other hand, isn’t considered explosive by itself. However, an explosive derivative presents itself: the compound is hygroscopic, absorbing moisture from the atmosphere to form sodium formate and ammonia, and in sufficiently high concentrations ammonia reacts explosively with air, especially if heated up to 200° C.

Sodium cyanide is also wildly toxic, and reports have already emerged that it has been found in “nearby drains after the blasts“. It is very soluble in water, and if it enters the body in quantities as small as 3 mg per kg of body weight, it rapidly shuts down the cells’ ability to use oxygen and results in death. The Associated Press claims “several hundred tons” of it were present in the warehouse (according to one estimate, 700 tons – a decidedly unholy quantity within a kilometre’s radius of residential dwellings, and enough to poison 91% of all Asia to death*).
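A rough back-of-the-envelope check of that figure, using the footnoted assumptions (a lethal dose of 3 mg per kg and an average body mass of 57.7 kg) plus an assumed Asian population of roughly 4.4 billion in 2015 – the last number is ours, not the article’s:

\[ \frac{700\ \mathrm{tonnes}}{3\ \mathrm{mg\,kg^{-1}} \times 57.7\ \mathrm{kg}} \approx \frac{7\times10^{11}\ \mathrm{mg}}{173\ \mathrm{mg}} \approx 4\times10^{9}\ \text{lethal doses}, \]

which is indeed on the order of nine-tenths of Asia’s population.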

 

*

According to a BBC report, China is the world’s largest consumer of hazardous chemicals, and so can claim expertise in the storage and transport of large quantities of them – and can claim no ignorance of how the warehouse came to store over 3,000 tonnes of chemicals with the nearest houses less than a kilometre away. The chemicals themselves are frequently used in the metals industry, in the manufacture of synthetic substances, and for extracting some precious metals from their ores.

No wonder then that some Ruihai employees have been arrested, and a former government official who had served in the area called in for questioning. Beyond the difficulty of cleaning up an area made more dangerous by the addition of water, the bigger challenge facing the Chinese government is the chemistry itself: without a licence to handle hazardous substances between October 2014 and June 2015, how did Ruihai amass the ammonium nitrate, calcium carbide, potassium nitrate and sodium cyanide, and why?

*Assuming the average body mass of an Asian adult is 57.7 kg, and with “people” referring only to Asian adults.

The Wire
August 18, 2015

DNA Bill uploaded for feedback

The Department of Biotechnology, under the Ministry of Science & Technology, is soliciting feedback on the Human DNA Profiling Bill, a scanned copy of which has been uploaded to the DBT website – accessible here. It is dated June 9, 2015, and is accompanied by some handwritten corrections. Public feedback is being solicited after the Bill had been slated for introduction in the monsoon session of Parliament; the introduction eventually didn’t happen at all because the session was washed out.

The last date for submitting feedback is August 20, 2015, at this email address.

On July 24, The Wire had reported on numerous shortcomings in the draft Bill, largely concerning the lack of accountability and privacy safeguards, as well as the absence of any financial memoranda. While a government representative – Dr. J. Gowrishankar, director of the Centre for DNA Fingerprinting and Diagnostics – had responded to our criticisms on July 25, he nonetheless didn’t mention if the draft Bill would or wouldn’t be modified in response to the issues we had raised.

However, the June-9 version of the Bill on which feedback from the people has been solicited differs from the working draft we had used – dated January 16, 2015.

In the new version, the table of contents and preamble aren’t included; Gowrishankar had previously noted that the Bill would be tabled without the preamble. However, it’s unclear why the table of contents was left out too – or, for that matter, why only a scanned copy of the Bill was uploaded.

Other such minor changes have been made throughout the Bill – although a few significant ones stand out as well. For instance, Section 12(k) of the working draft – which tasked the supervising DNA Profiling Board with “making recommendations for maximising the use of DNA techniques and technologies in administration of justice” – has been excluded from the new version.

For another, the self-contradictory Section 14(2) of the draft Bill has been removed in the new draft – the provision that DNA profiling labs already in existence at the time of the Bill’s passing needn’t get approval to perform tests. Now, all labs – no matter how old or new – will require the Board’s permission to serve the Bill’s interests.

While largely well-intentioned, the older draft Bill lacked watertight safeguards against the abuse of the DNA profiles that’d be stored in the database. Specifically, it abdicated the responsibility of defining best practices for extracting the profiles, didn’t define any operational costs, didn’t factor in any of the privacy-related course corrections suggested by the 2012 Report of the Group of Experts on Privacy, provided for no anonymisation protocols, and vested too many powers in the overseeing Board.

With the removal of Section 12(k), the new draft gives the Board a less self-indulgent ambit, even if the drafting committee hasn’t gone farther than that to ensure there will be independent regulatory oversight. In a previous conversation, Gowrishankar had said that such oversight would stem by default from the Parliament, but the just-concluded monsoon session illustrates how important decisions concerning the database could be delayed simply because MPs are distracted by other commitments.

Similarly, the Group of Experts’ privacy recommendations are also still missing. Without them, the Bill doesn’t do the following things, even though they’ve come to be recognised as important components of an effective privacy law around the world.

  1. Provide a notice that DNA samples were collected from such-and-such areas of the body
  2. Inform anybody – particularly the individual – if and when her/his DNA is contaminated, misplaced or stolen
  3. Inform a person if a case involving her/his DNA is pending, ongoing or closed
  4. Inform the people when there are changes in how their DNA is going to be accessed, or if the way their DNA is being stored or used is changed
  5. Distinguish between when DNA can be collected with consent and when it can’t
  6. Say how volunteers can contribute their DNA to the database even though the draft Bill has a provision for voluntary submissions
  7. Provide any explicit guarantee that the collected DNA won’t be used for anything other than circumstances specified in the Bill
  8. Specify when doctors or the police can or can’t access DNA profiles

The new draft also contains a new provision – under Section 24(5) – for DNA profile databases to be set up in individual states as well, without saying if the same safeguards that apply to the national repository will apply to the regional ones. So, as a result of all these omissions, the new draft Bill remains, like its previous version, in a suboptimal state. And while it is odd that the draft was opened up for public feedback after it was set to be introduced in Parliament – it usually happens before – it has been opened now, until August 20.

The Wire
August 18, 2015

AT&T, the weakest link

Amid the throng of American companies and their confused compliance with the National Security Agency’s controversial decade-long snooping on internal and international communications, The New York Times and ProPublica have singled out one that actually bent over backwards to please the NSA: AT&T. The basis of their allegations is a tranche of NSA documents detailing the features and scope of AT&T’s compliance with the agency’s ‘requests’, dating from 2003 to 2013.

The standout feature of the partnership is that, according to a note from AT&T, it wasn’t contractual, implying the ISP hadn’t been coerced into snooping and sharing data on the traffic that passed through its domestic servers. As ProPublica writes, “its engineers were the first to try out new surveillance technologies invented by the eavesdropping agency”. One of the documents even goes as far as to “highlight the Partner’s extreme willingness to help with NSA’s SIGINT and Cyber missions”.

The documents were part of those released by whistleblower Edward Snowden in 2013. According to the reporters, the three entities implicated in them – the NSA, AT&T and Verizon – refused to discuss the findings, in keeping with what has become a tradition of ISPs refusing to reveal the terms of their ‘collaborations’ and of the NSA refusing to reveal which ISPs it worked with. Since Snowden released the documents in 2013, public ire against the government’s intrusive snooping programmes has increased, even as President Barack Obama as well as the judiciary have agreed that revealing any more details than Snowden already had would threaten national security.

As a result, the news that AT&T didn’t bother challenging the NSA throws valuable light on how the agency was able to eavesdrop on foreign governments and international organisations.

The ISPs aren’t named but are referred to by code names; their real identities were given away when the dates of some of their surveillance ops coincided, sometimes too perfectly, with the dates on which some fibre optic cables were ‘repaired’. For example, a document dated August 5, 2011, talks about Fairview’s data-logging resuming over a cable damaged by the earthquake near Japan that year – while, ProPublica states, a “Fairview fiber-optic cable … was repaired on the same date as a Japanese-American cable operated by AT&T”. So the Fairview programme was found to be NSA + AT&T, and the Stormbrew programme, NSA + Verizon/MCI.

However, AT&T got more attention than Stormbrew. In 2011, the NSA spent $188.9 million on AT&T and less than half that on Verizon, possibly because the former also practised peering, an arrangement in which one company carries traffic through its network on behalf of other companies. As a result, users’ data from other ISPs and TSPs also ended up passing through the tapped AT&T servers.

AT&T’s complicity dates back to the mid-1980s, when antitrust regulators broke up the monopolistic Ma Bell telephone company, a fragment of which became AT&T. Its formation roughly coincided with the NSA’s launching of the Fairview program, into which the TSP got subsumed. Following the 9/11 attacks, both Fairview and Stormbrew assumed centre-stage in the agency’s anti-terrorism programmes, with Fairview being especially effective. As the Times writes, “AT&T began turning over emails and phone calls ‘within days’ after the warrantless surveillance began in October 2001”.

All the documents disclosed by the publications in the latest release are available here.

The Wire
August 16, 2015

Climatic fates in the ooze

While governments scramble to submit the laziest possible climate-change commitments ahead of the UN conference in Paris later this year, the world is being primed to confront how life on land will change as the atmosphere and surface heat up. But for another world – one that has often shown up its terran counterpart in sheer complexity – scientists are far from understanding how things will change over the next 85 years.

Climatologists and oceanographers were only recently able to provide a rounded explanation for why the rate of global warming slowed in the late 1990s – and into the 2010s: because the Pacific Ocean was absorbing heat from the lower atmosphere, and then palming it off to the Indian Ocean. But soon after the announcement of that discovery, another team from the US armed with NASA data said that the rate of warming hadn’t slowed at all and that it seemed that way thanks to some statistical anomalies.

Irrespective of which side is right, the bottomline is that the oceans’ role in climate change remains poorly understood. It hasn’t been for want of trying – and now a new study in the journal Geology presents the world’s first digital map of what rests on the oceans’ floors, the first comprehensive update since the 1970s.

The ocean floor is in effect a graveyard of all the undersea creatures that have ever lived, but the study’s significance for tracking climate change lies with the smallest of those creatures – the tiny plankton that inhabit the bottommost rungs of the oceanic food chain. Their populations in the surface and pelagic zones of the oceans increase with the abundance of silica and carbon, and when they die, or the animals that eat them die, they sink into the abyss – taking a bit of carbon along with them. This deceptively simple mechanism, called the biological pump, is what allows the world’s larger waterbodies to absorb carbon dioxide from Earth’s atmosphere.

Digital map of major lithologies of seafloor sediments in world’s ocean basins. Source: doi: 10.1130/G36883.1

The new map, made by scientists from the University of Sydney and the Australian Technology Park, shows that, contrary to popular belief, the ocean basins are not blanketed by broad bands of sediment so much as dotted with pockets of it, varying in size and abundance with surface characteristics and the availability of certain minerals.

A photomontage of plankton. Credit: Kils/Wikimedia Commons, CC BY-SA

For example, diatom ooze – not watery eidolons of muck sticking to the underside of your shoe but crystalline deposits composed of minerals and the remains of silica-shelled plankton called diatoms – shows up in widespread patches (of light green in the map) throughout the Southern Ocean, between 60º and 70º S.

The ooze typically forms in the 0.8-8º C range at depths of 3.3-4.8 km, and is abundant in the new map where the temperatures range from 0.9º to 5.7º C. Before this map came along, oceanographers – as well as climatologists – had assumed these deposits to be lying in continuous belts, like large undersea continents. But together with the uncertainty in data about the pace and quanta of warming, scientists had been grappling with a shifting image of climate change’s effects on the oceans.

The locations of diatom ooze also contribute to a longstanding debate about whether the ooze settles directly below the largest diatom populations. According to the Australian study’s authors, “Diatom ooze is most common below waters with very low diatom chlorophyll concentration, forming prominent zones between 50° S and 60° S in the Australian-Antarctic and the Bellinghausen basins”. The debate’s origins lie in the common use of diatoms to judge water quality: some species proliferate only in clean water, some in polluted water, and many others are differentiated by their preferred environments – saline, acidic, warm, and so on.

The relative abundance of one species of plankton over the other could, for example, become a reliable indicator of another property of the water that scientists have had trouble measuring: acidity. The dropping pH levels in the oceans are – or could be – a result of dissolving carbon dioxide. While some may view the oceans as great benefactors for offsetting the pace of warming by just a little bit, the net effect for Earth has continued to be negative: acidic waters dissolve the shells of molluscs faster and could drive populations of fishes away from where humans have set up fisheries.

Ocean acidification’s overall effect on the global economy could be a loss of $1 trillion per year by 2100, a UN report has estimated – even as a report in the ICES Journal of Marine Science found that 465 studies published between 1993 and 2014 sported a variety of methodological failures that compromised their findings about precise levels of acidity. The bottomline, as with scientists’ estimates of the rate of pelagic warming, is that we know the oceans are acidifying but are unsure by how much.

The new map thus proves useful for assessing how different kinds of ooze got where they are and what they imply about how the world around them is changing. For example, as the paper states, “diatom oozes are absent below high diatom chlorophyll areas near continents”, where sediments derived from the erosion of rocks provide plenty of nutrients to the oceans’ surfaces – in effect illustrating how a warming Earth carries a continuum of implications across contiguous biospheres.

The Wire
August 13, 2015