Using a network of telescopes scattered across the globe, including the Danish 1.5-m telescope at ESO La Silla (Chile), astronomers discovered a new extrasolar planet significantly more Earth-like than any other planet found so far. The planet, which is only about 5 times as massive as the Earth, circles its parent star in about 10 years. It is the least massive exoplanet around an ordinary star detected so far and also the coolest. The planet most certainly has a rocky/icy surface. Its discovery marks a groundbreaking result in the search for planets that support life.

Money for science

Spending money on science has been tied to evaluating the value of spin-offs, assessing the link between technological advancement and GDP, and dissecting the metrics of productivity, but the debate never settles, no matter how convincingly it seems to be resolved each time.

In a piece titled The Telescope of the 2030s, Dennis Overbye writes in The New York Times:

I used to think $10 billion was a lot of money before TARP, the Troubled Asset Relief Program, the $700 billion bailout that saved the banks in 2008 and apparently has brought happy days back to Wall Street. Compared with this, the science budget is chump change, lunch money at a place like Goldman Sachs. But if you think this is not a bargain, you need look only as far as your pocket. Companies like Google and Apple have leveraged modest investments in computer science in the 1960s into trillions of dollars of economic activity. Not even Arthur C. Clarke, the vaunted author and space-age prophet, saw that coming.

Which is to say that all that NASA money — whether for planetary probes or space station trips — is spent on Earth, on things that we like to say we want more of: high technology, education, a more skilled work force, jobs, pride in American and human innovation, not to mention greater cosmic awareness, a dose of perspective on our situation here among the stars.

And this is a letter from Todd Huffman, a particle physicist at Oxford, to The Guardian:

Simon Jenkins parrots a cry that I have heard a few times during my career as a research scientist in high-energy physics (Pluto trumps prisons when we spend public money, 17 July). He is unimaginatively concerned that the £34m a year spent by the UK at Cern (and a similar amount per year would have been spent on the New Horizons probe to Pluto) is not actually money well spent.

Yet I read his article online using the world wide web, which was developed initially by and for particle physicists. I did this using devices with integrated circuits partly perfected for the aerospace industry. The web caused the longest non-wartime economic boom in recorded history, during the 90s. The industries spawned by integrated circuits are simply too numerous to count and would have been impossible to predict when that first transistor was made in the 50s. It is a failure of society that funnels such economic largesse towards hedge-fund managers and not towards solving the social ills Mr Jenkins rightly exposes.

Conflict of interest? Not really. Science is being cornered from all sides, and if anyone is going to defend its practice, it’s going to be scientists. But we’re often too ready to confuse participation for investment, and at the first hint of an allegation of conflict, we don’t wait to verify matters for ourselves.

I’m sure Yuri Milner’s investment of $100 million today to help the search for extra-terrestrial intelligence will be questioned, too, despite Stephen Hawking’s moving endorsement of it:

Somewhere in the cosmos, perhaps, intelligent life may be watching these lights of ours, aware of what they mean. Or do our lights wander a lifeless cosmos — unseen beacons, announcing that here, on one rock, the Universe discovered its existence. Either way, there is no bigger question. It’s time to commit to finding the answer – to search for life beyond Earth. We are alive. We are intelligent. We must know.

Pursuits like exploring the natural world around us are, I think, what we’re meant to do as humans, what we must do when we can, and what we must ultimately aspire to.

DoT backs net neutrality but wants end to free domestic Skype, WhatsApp calls

The Wire
July 17, 2015

It’s just good business. Credit: balleyne/Flickr, CC BY 2.0.

A Department of Telecommunications committee has released a report on the issue of net neutrality, following the controversial policy consultation paper that the Telecom Regulatory Authority of India put out in May. The report falls in line with many of the popular demands that surged on social media following the TRAI paper, and includes this telling line: “The Committee is of the view that the statement of [telecom companies] that they are under financial stress due to the rapidly falling voice revenues and insufficient growth in data revenues, is not borne out by evaluation of financial data.”

At the same time, it also tucks in a potentially controversial suggestion that could rekindle debate: of regulating domestic calls made through VoIP-enabled over-the-top (OTT) services like WhatsApp and Viber through the Telegraph Act, while leaving alone international calls made through the same apps. It remains to be seen how many of the report’s recommendations TRAI will adopt.

One of the more contentious topics in the TRAI paper was whether OTT services like Facebook and WhatsApp, called so because they rely on local Internet service providers to relay data between their applications and users, should be regulated in India. The DoT report states that non-VoIP OTTs, as well as application-based services like Uber and Ola Cabs, won’t be regulated. VoIP stands for voice-over Internet Protocol, the use of an Internet connection to make phone calls.

Strangely, the report marks a distinction between domestic and international calls made through VoIP OTTs, and recommends that only the former be regulated. The ostensible reason for this is that the DoT wants to protect the revenues of telecom companies and, possibly, doesn’t want to interfere with the millions of middle-class Indians who keep in touch with their sons and daughters abroad. But no explicit reason for this differentiation has been provided. In fact, as Pranesh Prakash of the Centre for Internet and Society pointed out on Twitter, the DoT’s suggested use of licenses to regulate such VoIP OTTs isn’t a net-neutrality issue in the first place.

Beyond this sore point: the report also examines how – and how not – to inspect data packets flowing through the ‘pipes’, or connections between nodes, of the Internet, and expressly rules out the illegal use of deep packet inspection. Deep packet inspection is a technique often used on networks to eavesdrop on data as it passes through a pipe. The document has also been courageous enough to admit that not all zero-rating plans “are controversial or against the net neutrality principles”. Zero-rating is akin to a toll-gate within a pipe which allows data of some forms or originating from certain sources to pass through without a fee while taxing the rest. Such implementations could be useful when providing government services – like railway bookings – cheaply to the rural poor, but at the same time would have to be protected from anti-competitive uses by private enterprises.
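To make the toll-gate analogy concrete, here is a deliberately simplified, hypothetical sketch of a zero-rating billing rule – the host names and tariff are invented for illustration and do not describe any real telecom system:

```python
# Hypothetical, deliberately simplified illustration of zero-rating:
# traffic from whitelisted sources passes the 'toll gate' free, everything
# else is metered. Host names and tariff below are invented.
ZERO_RATED_HOSTS = {"railway-bookings.gov.example"}  # e.g. a zero-rated government service
RATE_PER_MB = 0.25                                   # illustrative tariff, in rupees per MB

def data_charge(host: str, megabytes: float) -> float:
    """Return the amount billed for this much traffic to/from the given host."""
    if host in ZERO_RATED_HOSTS:
        return 0.0
    return megabytes * RATE_PER_MB

print(data_charge("railway-bookings.gov.example", 50))  # 0.0 -- not billed
print(data_charge("video-streaming.example.com", 50))   # 12.5 -- billed as usual
```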

Thus, the report recommends “the incorporation of a clause in the license conditions of TSP/ISPs that will require the licensee to adhere to the principles and conditions of Net Neutrality specified by guidelines issued by the licensor from time to time”.

Beyond the questions surrounding net neutrality, the report also takes a stand on India’s digital sovereignty, taking cognisance of the fact that “there is a need for a balance to be drawn to retain the country’s ability to protect the privacy of its citizens and data protection without rendering it difficult for business operations”. It goes on to suggest that the TRAI could “identify critical and important areas through public consultations” when the question of hosting data locally – in servers as well as pipes physically located in the country – arises. Now, the ball is decidedly in TRAI’s court, and it would be unfair to say the body isn’t under pressure to implement what appears to be an amenable report from the DoT.

Of small steps and giant leaps of collective imagination

The Wire
July 16, 2015

Is the M5 star cluster really out there? Credit: HST/ESA/NASA

We may all harbour a gene that moves us to explore and find new realms of experience but the physical act of discovery has become far removed from the first principles of physics.

At 6.23 am on Wednesday, when a signal from the New Horizons probe near Pluto reached a giant antenna in Madrid, cheers went up around the world – with their epicentre at the Applied Physics Laboratory in Maryland, USA.

And the moment it received the signal, the antenna’s computer also relayed a message through the Internet that updated a webpage showing the world that New Horizons had phoned home. NASA TV was broadcasting a scene of celebration at the APL and Twitter was going berserk as usual. Subtract these instruments of communication and the memory of humankind’s rendezvous with Pluto on the morning of July 15 (IST) is delivered not by the bridge of logic but a leap of faith.

In a memorable article in Nature in 2012, the physicist Daniel Sarewitz made an argument that highlighted the strength and importance of good science communication in building scientific knowledge. Sarewitz contended that it was impossible for anyone but trained theoretical physicists to understand what the Higgs boson really was, how the Higgs mechanism that underpins it worked, or how any of them had been discovered at the Large Hadron Collider earlier that year. The reason, he said, was that a large part of high-energy physics is entirely mathematical, devoid of any physical counterparts, and explores nature in states the human condition could never physically encounter.

As a result, without the full knowledge of the mathematics involved, any lay person’s conviction in the existence of the Higgs boson would be punctured here and there with gaps in knowledge – gaps the person will be continuously ignoring in favour of the faith placed in the integrity of thousands of scientists and engineers working at the LHC, and in the comprehensibility of science writing. In other words, most people on the planet won’t know the Higgs boson exists but they’ll believe it does.

Such modularisation of knowledge – into blocks of information we know exist and other blocks we believe exist – becomes more apparent the greater the interaction with sophisticated technology. And paradoxically, the more we are insulated from it, the easier it is to enjoy its findings.

Consider the example of the Hubble space telescope, rightly called one of the greatest astronomical implements to have ever been devised by humankind.

Its impressive suite of five instruments, highly polished mirrors and advanced housing all enable it to see the universe in visible-to-ultraviolet light in exquisite detail. Its opaque engineering is inaccessible to most but this gap in public knowledge has been compensated many times over by the richness of its observations. In a sense, we no longer concern ourselves with how the telescope works because we have drunk our fill with what it has seen of the universe for us – a vast, multihued space filled with the light of a trillion stars. What Hubble has seen makes us comfortable conflating belief and knowledge.

The farther our gaze strays from home, the more we will become reliant on technology that is beyond the average person’s intellect to comprehend, on rules of physics that are increasingly removed from first principles, on science communication that is able to devise cleverer abstractions. Whether we like it or not, our experience, and memory, of exploration is becoming more belief-ridden.

Like the Hubble, then, has New Horizons entered a phase of transience, too? Not yet. Its Long-Range Reconnaissance Imager has captured spectacular images of Pluto, but none yet quite so spectacular as to mask our reliance on non-human actors to obtain them. We know the probe exists because the method of broadcasting an electromagnetic signal is somewhat easily understood, but then again most of us only believe that the probe is functioning normally. And this will increasingly be the case with the smaller scales we want to explore and the larger distances we want to travel.

Space probes have always been sophisticated bits of equipment but with the Internet – especially when NASA TV, DSN Now and Twitter are the prioritised channels of worldwide information dissemination – there is a perpetual yet dissonant reminder of our reliance on technology, a reminder of the Voyager Moment of our times being a celebration of technological prowess rather than exploratory zeal.

Our moment was in fact a radio signal reaching Madrid, a barely romantic event. None of this is a lament but only a recognition of the growing discernibility of the gaps in our knowledge, of our isolation by chasms of entangled 1s and 0s from the greatest achievements of our times. To be sure, the ultimate benefactor is science but one that is increasingly built upon a body of evidence that is far too specialised to become something that can be treasured equally by all of us.

Instead of reaching the sky, Aakash ends up six feet below

The Wire
July 15, 2015

Once at the centre of the Indian government’s half-baked schemes to make classrooms tech-savvy, the Aakash project wound down quietly in March 2015, an RTI has revealed. The project envisaged lakhs of school and engineering students armed with a tablet each, sold at Rs.1,130 courtesy the government, from which they partake of their lessons, access digitised textbooks and visualise complicated diagrams. Lofty as these goals were, the project was backed by little public infrastructure and much less coordination, resulting in almost no traction despite being punctuated regularly with PR ops.

The project was conceived in 2011 by the UPA-2 government to parallel the One Laptop Per Child program, forgetting conveniently that the latter worked only in small Uruguay and for unique reasons. Anyway, a British-Canadian company named DataWind was contracted to manufacture the tablets, which the government would then purchase for Rs.2,263 and subsidise so as to retail them at Rs.1,130.

However, the second version – whose development was led by IIT-Bombay and the Centre for Development of Advanced Computing, and which was released in November 2012 – bordered on the gimmicky. It had 512 MB RAM, a 7” screen, a 1 GHz processor and, worst of all, a battery that lasted all of three hours even as a full day at school typically spanned seven. Even so, the government announced that 50 lakh such tablets would be manufactured and that 1 lakh teachers would be trained to use them. In March 2013, then Union HRD Minister Pallam Raju called it the government’s “dream project”.

But what really crippled the program was not the operational delays or logistical failures but the Central government’s lackadaisical assumption that placing a tablet in a student’s hands would solve everything. For example, it was advertised that Aakash would be a load off children’s backs, eliminating the need to lug around boatloads of books. However, the NCERT didn’t bother to explain which textbooks would be digitised first – or at all – and when they’d be available. Similarly, the low-income households whose younger occupants the tablets targeted didn’t have access to regular electricity let alone an Internet connection. What the tablets would ultimately do was become, for those who couldn’t afford to maintain and use them, a burden.

The Aakash train on the other hand was on rails of its own. By November 2011, DataWind had shipped 6,440 devices but only 650 were found good enough to sell. Nonetheless, in January 2013, IIT-Bombay announced it was starting work on Aakash 3, and by July the same year had skipped to working on the fourth iteration. Then, in September 2013, the CAG alleged that IIT-Rajasthan, which had handled the Aakash project in 2011, had been awarded the project arbitrarily, received Rs.47.42 crore without any prior feasibility checks, and overran its budget by Rs.1.05 crore. However, this did nothing to slow things down.

The biggest beneficiary was DataWind, the air in its bellows blown by the Central government’s fantasy of arming itself with the same cargo that Western institutions sported. Between December 2013 and July 2014, the company was able to announce three new models in the Rs.4,000-7,000 price range, introduce one for the UK priced at £30, raise Rs.168 crore in an IPO, list on the Toronto Stock Exchange and get on the MIT Tech Review’s 50 smartest tech companies of 2014 list for breaking “the price barrier”.

The RTI application that revealed Aakash had been wound down also received the reply that the project had achieved all its objectives: of procuring one lakh devices, testing them and establishing 300 centres in engineering colleges – saying nothing of the more ostentatious goal of linking 58.6 lakh students across 25,000 colleges and 400 universities through an ‘e-learning’ program. The reply also stated that specifications for a future device had been submitted to the MHRD. Whether the project will be revived by the ruling BJP government later is unknown.

And on that forgettable note of uncertainty, one of the more misguided digital-India schemes comes to a close.

Yoichiro Nambu, the silent revolutionary of particle physics, is dead

The Wire
July 18, 2015

Particle physics is an obscure subject for most people but everyone sat up and took notice when the Large Hadron Collider discovered the particle named after Peter Higgs in 2012. The Higgs boson propelled his name to the front pages of newspapers that until then hadn’t bothered about the differences between bosons and fermions. At the same time, it also validated a hypothesis he and his peers had made 50 years ago and helped the LHC’s collaborations revitalise their outreach campaigns.

However, much before the times of giant particle colliders – in the late 1950s, in fact – a cascade of theories was being developed by physicists the world over with much less fanfare, and a lot more of the quiet dignity that advanced theoretical physics is comfortable revelling in. It was a silent revolution, and led in part by the mild-mannered Yoichiro Nambu, who passed away on July 5, 2015.

His work and its derivatives gave rise to the large colliders like the LHC at work today, and might well have laid the foundations of modern particle physics research. Moreover, many of his and his peers’ accomplishments are not easily discussed the way political movements are, nor do they aspire to such privileges, but that didn’t make them any less important than the work of Higgs and others.

Yoichiro Nambu also belonged to a generation that marked a resurgence in Japanese physics research – consider his peers: Yoshio Nishina, Masatoshi Koshiba, Hideki Yukawa, Sin-Itiro Tomonaga, Leo Esaki, Makoto Kobayashi and Toshihide Maskawa, to name a few. A part of the reason was a shift in Japan’s dominant political attitudes after the Second World War. Anyway, the first of Nambu’s biggest contributions to particle physics came in 1960, and it was a triumph of intuition.

There was a span of 46 years between the discovery of superconductivity (by Heike Kamerlingh Onnes in 1911) and the birth of a consistent theoretical explanation for it (by John Bardeen, Leon Cooper and John Schrieffer in 1957) because the phenomenon seemed to defy some of the first principles of the physics used to understand charged particles. Nambu was inspired by the BCS theory to attempt a solution for the hierarchy problem – which asks why gravity, among the four fundamental forces, is 10^32 times weaker than the strongest of them, the strong nuclear force.

With the help of British physicist Jeffrey Goldstone, Nambu theorised that whenever a natural symmetry breaks, massless particles called Nambu-Goldstone bosons are born under certain conditions. The early universe, around 13.75 billion years ago when it was extremely small, consisted of a uniform pond of unperturbed energy. Then, the pond was almost instantaneously heated to a temperature of 173 billion Suns, when it broke into smaller packets called particles. The symmetry was (thought to be) spontaneously broken and the event was called the Big Bang.

Then, as the universe started to cool, these packets couldn’t reunify into becoming the pond they once made up, evolving instead into distinct particles. There were perturbations among the particles and the resultant forces were mediated by what came to be called Nambu-Goldstone bosons, named for the physicists who first predicted their existence.
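The standard textbook illustration of that claim – a sketch of Goldstone’s theorem, assumed here as background and not drawn from the article or from Nambu’s own papers – is a field with a symmetric ‘Mexican hat’ potential:

```latex
% Illustrative textbook sketch of spontaneous symmetry breaking (standard
% material, not from the article). A complex scalar field \phi with a
% U(1)-symmetric potential:
V(\phi) = -\mu^{2}\,|\phi|^{2} + \lambda\,|\phi|^{4}, \qquad \mu^{2}, \lambda > 0
% The minima lie on a circle |\phi| = v = \mu/\sqrt{2\lambda}; the field must
% settle at one point on that circle, spontaneously breaking the symmetry.
% Expanding about the chosen minimum,
\phi(x) = \left( v + \tfrac{1}{\sqrt{2}}\, h(x) \right) e^{\, i\,\theta(x)/v}
% the radial mode h(x) acquires a mass, while shifting the angular mode
% \theta(x) along the circle of minima costs no potential energy:
% \theta is the massless Nambu-Goldstone boson guaranteed by the broken symmetry.
```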

Yoichiro Nambu in 2008. Source: University of Chicago

Nambu was able to use the hypothetical interactions between the Nambu-Goldstone bosons and particles to explain how the electromagnetic force and the weak nuclear force (responsible for radioactivity) could be unified into one electroweak force at higher temperatures, as well as where the masses of protons and neutrons come from. These were (and are) groundbreaking ideas that helped scientists make sense of the intricate gears that turned then to make the universe what it is today.

Then, in 1964, six physicists (Higgs, Francois Englert, Tom Kibble, Gerald Guralnik, C.R. Hagen, Robert Brout) postulated that these bosons interacted with an omnipresent field of energy – called the Higgs field – to give rise to the strong-nuclear, weak-nuclear (a.k.a. weak) and electromagnetic forces, and the Higgs boson. And when this boson was discovered in 2012, it validated the Six’s work from 1964.

However, Nambu’s ideas – as well as those of the Six – also served to highlight how the gravitational force couldn’t be unified with the other three fundamental forces. In the 1960s, Nambu’s first attempts at laying out a framework of mathematical equations to unify gravity and the other forces gave rise to the beginnings of string theory. But in the overall history of investigations into particle physics, Nambu’s work – rather, his intellect – was a keystone. Without it, the day theorists’ latinate squiggles on paper could’ve become prize-fetching particles in colliders would’ve been farther off, the day we made sense of reality farther off, the day we better understood our place in the universe farther off.

The Osaka City University, where Nambu was a professor, announced his death on July 17, due to an acute myocardial infarction. He is survived by his wife Chieko Hida and son John. Though he was an associate professor at Osaka from 1950 to 1956, he visited the Institute for Advanced Study at Princeton in 1952 to work with Robert Oppenheimer (and meet Albert Einstein). Also, in 1954, he became a research associate at the University of Chicago and finally a professor there in 1958. He received his American citizenship in 1970.

Peter Freund, his colleague in Chicago, described Nambu as a person of incredible serenity in his 2007 book A Passion for Discovery. Through the work and actions of the biggest physicists of the mid-20th century, the book fleshes out the culture of physics research and how it was shaped by communism and fascism. Freund himself emigrated from Romania to the US in the 1960s to escape the dictatorial madness of Ceausescu, a narrative arc that is partially reflected in Nambu’s life. After receiving his bachelor’s degree from the University of Tokyo in 1942, Nambu was drafted into the army; he witnessed the infamous firebombing of Tokyo and was in Japan when Hiroshima and Nagasaki were bombed.

The destructive violence of the war that Nambu studied through is mirrored in the creative energies of the high-energy universe whose mysteries Nambu and his peers worked to decrypt. It may have been a heck of a life to live through but the man himself had only a “fatalistic calm”, as Freund wrote, to show for it. Was he humbled by his own discoveries? Perhaps, but what we do know is that he wanted to continue doing what he did until the day he died.

What you need to know about the Pluto flyby

The Wire
July 14, 2015

In under seven hours, the NASA New Horizons space probe will fly by Pluto at 49,900 km per hour, from a distance of 12,500 km. It’s what the probe set out to do when it was launched in January 2006. The flyby will allow it to capture high-resolution images of the dwarf planet’s surface and atmosphere as well as take a look at its biggest moon, Charon. For much of the rest of the day, it will not be communicating with mission control as it conducts observations. The probe’s Long-Range Reconnaissance Imager (LORRI) has already been sending better and better pictures of Pluto as it gets closer. During closest approach, Pluto will occupy the entire field of view of LORRI to reveal the surface in glorious detail.

Fourteen minutes into the Pluto flyby, New Horizons will make its closest approach to Charon, which is about 24,000 km away. Next: 47 minutes and 28 seconds after the Charon flyby, the probe will find itself in Pluto’s shadow where its high-gain antennae will make observations of how the dwarf planet’s atmosphere affects sunlight and radio signals from Earth as they pass through it. Then, 1 minute and 2 seconds after that, New Horizons will again be in sunlight. Finally, 1 hour and 25 minutes later, it will be in Charon’s shadow to look for its atmosphere.
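A minimal sketch of that event sequence, measured from the moment of closest approach to Pluto and built only from the intervals quoted above (the absolute clock times are NASA’s to announce):

```python
from datetime import timedelta

# Event times relative to Pluto closest approach (t = 0), built from the
# intervals quoted above; actual clock times depend on NASA's published schedule.
t_charon = timedelta(minutes=14)                                     # closest approach to Charon
t_enter_pluto_shadow = t_charon + timedelta(minutes=47, seconds=28)  # enters Pluto's shadow
t_exit_pluto_shadow = t_enter_pluto_shadow + timedelta(minutes=1, seconds=2)
t_enter_charon_shadow = t_exit_pluto_shadow + timedelta(hours=1, minutes=25)

events = [
    ("Closest approach to Charon", t_charon),
    ("Enters Pluto's shadow", t_enter_pluto_shadow),
    ("Back in sunlight", t_exit_pluto_shadow),
    ("Enters Charon's shadow", t_enter_charon_shadow),
]
for label, t in events:
    print(f"T+{t}  {label}")
```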

That New Horizons survived the flyby will be known when, on early Wednesday morning (IST), it starts to send communication signals Earthward again. The timings of various events announced by NASA will have to be adjusted against the fact that New Horizons is 4.5 light-hours away from Earth. NASA has called for a press conference to release the first close-up images at 0030 hrs on July 16 (IST). The entire data snapped by the probe during the flyby will be downloaded over a longer period of time. According to Emily Lakdawalla,

Following closest approach, on Wednesday and Thursday, July 15 and 16, there will be a series of “First Look” downlinks containing a sampling of key science data. Another batch of data will arrive in the “Early High Priority” downlinks over the subsequent weekend, July 17-20. Then there will be a hiatus of 8 weeks before New Horizons turns to systematically downlinking all its data. Almost all image data returned during the week around closest approach will be lossily compressed — they will show JPEG compression artifacts. Only the optical navigation images are losslessly compressed. [All dates/times in EDT]

Downloading the entire science dataset including losslessly compressed observations will take until around November 2016 to complete. Until then, the best will always be yet to come. As always, all communications will be via the Deep Space Network – whose Goldstone base is currently all ears for the probe.

DSN Now. Source: Screengrab

Incidentally, the ashes of the astronomer Clyde Tombaugh, who discovered Pluto in 1930, are onboard New Horizons.

What do we know about Pluto?

The last images taken by LORRI before the flyby revealed a strange geology on Pluto. Scientists noted dark and bright polygonal patches (in the shape of a whale and a <3, respectively) as well as what appeared to be ridges, cliffs and several impact craters. However, these features are on the side of Pluto facing New Horizons as it flies in. During the flyby, it will image the other side of Pluto, where these features may not be present. The probe can’t hang around to wait to see the other side either because Pluto rotates once every 6.4 Earth-days.

An annotated image of Pluto snapped by the New Horizons probe. Credit: Applied Physics Lab/NASA

During the flyby, images of Charon will also be taken. Already, the probe has revealed that, like Pluto, the moon has several intriguing features – giant craters and chasms among them – even though until recently both bodies were thought to be frozen, featureless balls of ice and rock. In fact, NASA noted one crater near Charon’s south pole, almost 100 km wide, and another on Pluto, some 97 km wide, both appearing to have been the result of recent impacts (in the last billion years). There are two theories to explain the particularly dark appearance of the Charon crater: either the ice at its bottom is of a different, less reflective kind than the usual, or the ice melted during impact and then refroze into larger, less bright grains.

An annotated image of Charon snapped by the New Horizons probe. Credit: Applied Physics Lab/NASA

All these features will be studied in detail during New Horizons’ flyby. They will reveal how the two bodies evolved in the past, the structure and composition of their interiors, and whether – for some astronomers – Charon might’ve harboured a subsurface ocean in its past. Complementarily, NASA will also be training the eyes of its Cassini, Spitzer and Kepler space-borne instruments on Pluto. Cassini, from its orbit around Saturn, will take a picture of New Horizons just around the time of its flyby. From July 23 to July 30, the Spitzer Space Telescope will study Pluto in the infrared, mapping its surface ice. Then, in October, the exoplanet-hunting Kepler telescope, in its second avatar as K2, will start focusing on the changes in brightness of and around Pluto to deduce the body’s orbital characteristics.

Then, there are also post-flyby missions whose results, when pieced together with the July 14 flyby and other observations, will expand our knowledge of Pluto in its larger environment: among the Kuiper Belt, at whose inner edge it resides.

Finally, as Dennis Overbye of The New York Times argued in a poignant essay, the Pluto flyby marks the last of the Solar System’s classical planets to be explored, the last of the planets the people of our generation will get to see up close. The next frontiers in planetary exploration will be the exoplanets – the closest of which is 4.3 light-years away (orbiting Alpha Centauri B). But until then, be willing to consider the Solar System’s moons, missions to which are less than a decade away. Leaving you with Overbye’s words:

Beyond the hills are always more hills, and beyond the worlds are more worlds. So New Horizons will go on, if all goes well, to pass by one or more of the cosmic icebergs of the Kuiper belt, where leftovers from the dawn of the solar system have been preserved in a deep freeze extending five billion miles from the sun…

But the inventory of major planets — whether you count Pluto as one of those or not — is about to be done. None of us alive today will see a new planet up close for the first time again. In some sense, this is, as Alan Stern, the leader of the New Horizons mission, says, “the last picture show.”

Physicists find exotic particle with five quarks

The Wire
July 14, 2015

An artist’s impression of five strongly bonded quarks in a pentaquark. Credit: CERN/LHCb Collaboration

“When you have eliminated the impossible, whatever remains, however improbable, must be the truth” – thus spake Sherlock Holmes. Particle physicists at the Large Hadron Collider today announced the discovery of a new particle after an investigation following in the steps of Holmes’ wisdom. The particle, called a pentaquark, is exceedingly rare in the books of fundamental physics. It’s named for the fact that it’s composed of five quarks, the indivisible particles that together make up all known matter.

Quarks commonly manifest as protons and neutrons, which are clumps of three quarks each. However, this is the first time experimental physicists have observed five quarks coming together to make a bigger particle.

The collaboration of scientists and engineers of the LHCb detector – which spotted the pentaquarks – uploaded a paper to the arXiv preprint server on July 13 and submitted a copy to the journal Physical Review Letters for publication. The abstract describes two resonances – or unstable particles – at masses 4,380 MeV and 4,449.8 MeV (to compare, a proton weighs 938 MeV), not including uncertainties in the range 40-110 MeV. They have been temporarily designated Pc(4380)+ and Pc(4450)+.

The LHCb detector spotted the pentaquarks during the particle decays of another particle called Λb (read Lambda b). However, instead of discerning their presence by a spike in the data, the scientists spotted them by accounting for all other data points and then figuring out one consistent explanation for what was left over. And the explanation called for conceding that the scientists had finally spotted the elusive pentaquark. “Benefitting from the large data set provided by the LHC, and the excellent precision of our detector, we have examined all possibilities for these signals, and conclude that they can only be explained by pentaquark states”, said LHCb physicist Tomasz Skwarnicki of Syracuse University in a statement.

According to the pre-print paper, the chances of the observation being a fluke, or due to some other process that could’ve mimicked the production of pentaquarks, are less than 1-in-3.5-million. As a result, the observations are sufficiently reliable and make for a discovery – even if the particle wasn’t observed as much as its unique shadow. At the same time, because the history of the experimental pursuit of pentaquarks is dotted with shepherds crying wolves, the data will be subjected to further scrutiny. In the most recent and famous case in 2003, four research labs from around the world (TJNAF, AITEP, SPring-8, ELSA) claimed to have spotted pentaquarks, only to be disproved by tests at the Istituto Nazionale di Fisica Nucleare in Genova in April 2005.

The LHC, which produces the high-energy collisions that detectors like the LHCb study in detail, shut down in early 2013 for a series of upgrades and reawakened in May 2015. The pentaquark was found in data gathered during the first run, when the LHC produced collisions at an energy of 8 TeV (1 TeV is 1,000 GeV, or a million MeV). In the second run, the collision energy has been hiked to 13 TeV, which increases the frequency with which exotic particles like pentaquarks could be produced.

WikiLeaked emails of IT firm show India as ‘really huge’ market for snooping, spyware

The Wire
July 10, 2015

Bangalore: Indian intelligence agencies and police forces are listed among the customers of an Italian company accused of selling spyware to repressive regimes.

Apart from the NIA, RAW, the intelligence wing of the Cabinet Secretariat and the Intelligence Bureau, the Gujarat, Delhi, Andhra and Maharashtra state police forces also earn a mention in the latest dump of files by WikiLeaks, which outs emails exchanged within the Milan-based IT company called Hacking Team, a purveyor of hacking and surveillance tools.

In an email from August 2011, an Israeli firm named NICE, in alliance with Hacking Team, discusses opportunities to loop in the Cabinet Secretariat. In February 2014, Hacking Team’s employees seem to be discussing the organisation of a webinar in which senior officials of the NIA, RAW and IB would be participating. While this may be deflected as an intelligence agency’s need for counter-surveilling recruitment attempts by groups like the Islamic State, it is unclear why the Andhra Pradesh police asked for equipment to snoop on mobile phones (“cellular interception hardware solutions”).

In fact, in an email dated June 8, 2014, a Hacking Team employee named David Vincenzetti writes that India “represents a huge – really huge and largely untapped – market opportunity for us”. Vincenzetti appears to have been spurred by former foreign minister Salman Khurshid’s infamous remark – “It’s not really snooping” – defending the American NSA’s widespread snooping after Edward Snowden’s revelations showed India was the fifth-most targeted nation.

Most of the emails relevant to Indian interests, dating from 2011 to 2015, discuss the “world’s largest democracy” as representing a big opportunity for Hacking Team. They also betray the company’s interest in Pakistan as a client, especially referencing India’s alliance with the United States following the war in Afghanistan and Osama bin Laden’s assassination in Abbottabad. A rough translation of the following statement in one of the emails, dated May 4, 2011

Il Pakistan e’ un paese diviso, fortemente religioso, rivale dell’India ma alletato degli US per la guerra in Afganistan. Dagli US riceve ogni anno >$4bn per aiuti civili e armamenti. Potrebbe essere un cliente eccezionale.

… reads: “Pakistan is a divided country, strongly religious, a rival of India but allied with the US for the war in Afghanistan. From the US it receives more than $4 billion every year for civilian aid and armaments. It could be an exceptional client.”

Apart from discussing the Pakistani civilian government as a potential business partner for its strategic positioning, Hacking Team emails also reveal that the company performed demonstrations for the erstwhile UPA-II government on how to snoop on phones and other mobile devices, once in 2011 at the invitation of the Indian embassy in Italy.

The emails were obtained by a hacker (or a group of hackers) identified as Phineas Fisher, who/which dug into Hacking Team’s internal communications and released them on July 6. Phineas Fisher also hacked the company’s Twitter account and renamed it Hacked Team, apart from launching a GitHub repository. The haul totaled over 400 GB, and included sensitive information from other infosec companies such as the British-German Gamma Group and the French VUPEN. In fact, the haul includes a 40-GB tranche from Gamma.

The hacker(s) subsequently went on to declare that he/they would wait for Hacking Team to figure out how its systems were hacked, failing which he/they would reveal it himself/themselves.

Apart from discussing the sale of interception equipment, the company was also considering a proposal to the Indian government to better wiretap data sent through RIM servers. RIM, or Research In Motion, is the Canadian company that owns Blackberry. The Indian government had long been trying to get RIM to hand over users’ data exchanged over Blackberry devices for civilian surveillance purposes.

Il BBM non puo’ essere intercettato. Ma con RCS si’. [The BBM cannot be intercepted. But with RCS, yes.]

RCS stands for Remote Control System, Hacking Team’s proprietary snooping software.

Ad ogni modo RCS offre possibilita’ ulteriori rispetto a qualunque sistema di intercettazione passiva come la cattura di dati che tipicamente non viaggiano via rete (rubrica, files, foto, SMS vecchi salvati, ecc.) e la possibilita’ di “seguire” un target indiano quando questo si reca all’estero (e.g., Pakistan). [In any case, RCS offers possibilities beyond those of any passive interception system, such as the capture of data that typically does not travel over the network (phonebook, files, photos, old saved SMSes, etc.), and the ability to “follow” an Indian target when it travels abroad (e.g., to Pakistan).]

Other state clients for RCS and other Hacking Team services include serial human-rights abusers Saudi Arabia, Ethiopia, Uzbekistan and Russia.

Although Hacking Team has gone on to vehemently deny the allegations, the information released in the WikiLeaks dump remains undeniable for now. And so, it’s unsettling at the least that the Indian government is seen to be in cahoots with a company that feeds on the persistence of human rights violations around the world.

Open defecation affects pregnancy outcomes too, new study finds

The Wire
July 7, 2015

Eradicating open defecation has become one of the biggest, and most daunting, challenges faced by the Narendra Modi government. Poor sanitation practices like open defecation have been known to cause stunting in children, not to mention a contamination of water resources leading to cholera, typhoid and dysentery. These adverse outcomes arise from the unhygienic environment that open fecal matter creates, festering with bacteria, viruses and other contaminants that poison food and water.

In the same way, a group of researchers from India and the United States asked themselves: can open defecation also affect pregnancy outcomes? “One of the possible consequences, apart from diarrhea and other gastrointestinal infections due to fecal oral contamination, of open defecation could be on women’s genito-urinary tract due to the proximity of the vagina to the anus,” Pinaki Panigrahi, a professor of epidemiology at the University of Nebraska Medical Centre and an author of the study, told The Wire.

And if this is indeed a plausible mechanism of infection, the adverse effects will not just be experienced by the mother but by the child as well. To investigate, Dr. Panigrahi and her colleagues – including from the Asian Institute of Public Health, Bhubaneshwar – enrolled a cohort of 670 women in their first trimesters of pregnancy from four villages from the Sundargarh and Khurda districts of Odisha. The state was chosen because it has an infant mortality rate of 53 per 1,000 live births and a maternal mortality rate of 235 per 100,000 women of reproductive age – both very high – apart from 75% of the population practicing open defecation. Then, the women’s health and sanitary practices were documented until they gave birth.

Locations of the villages in Odisha state.

The team found a “statistically significant association” between open defecation and pregnancy outcomes, which means there was a correlation between the two that persisted in various circumstances. Of the 670 women enrolled, 667 completed the study. Of those 667, 172 (28.2%) experienced adverse pregnancy outcomes – such as preterm births (130), babies with low birth-weight (95), spontaneous abortions (11) and stillbirths (6). However, despite weeding out many confounding factors, the study does not make a distinction between pregnancy outcomes due to open defecation and outcomes due to hand-washing practices after defecation.

For example, in the cohort, 58.17% (388) did not have access to latrines. Among those who did have access, less than half – 45.8% – used it on a regular basis. The team also found that 58% of pregnant women did not wash their hands with soap or detergent after defecation, a number that clearly includes a lot of the women who had indoor loos as well. With this in mind, the study’s distinction between indoor and outdoor defecation boils down to how hand-washing practices were affecting the maternal outcomes of pregnant women. This is highlighted by the fact that out of the 279 women who had access to a latrine, only 136 had a washing station at or near it – while 278 women out of the total 667 reported using a soap or detergent after defecating.

Adverse pregnancy outcomes stratified by sanitation characteristics.

The team also found that it didn’t matter whether the women who defecated in the open were wealthier or poorer: their odds of having a sub-par pregnancy were similar. What really improved the odds of a healthy pregnancy was education. An influential study by Thomas Clasen of Emory University, Georgia, had similar findings in October 2014: it wasn’t enough to put up accessible latrines; women also had to understand that they needed to use them. Apart from reinforcing old attitudes, however, the findings suggest a new ‘front’ on which to tackle the sanitation crisis facing the country from a policy standpoint.

But before that, to establish the effects of open defecation exclusively on pregnancy, the mechanism of infection needs to be found. “What does open defecation do? Induces an infection in the urogenital tract? A type of inflammation due to an infection or aberrant bacterial colonization of the vaginal tract? Or simply raises the stress level of the pregnant woman while going out to defecate in harsh environments, being exposed to unknown men, or simply withholding the urge to defecate or urinate?” We don’t know, and this is what studies of the future will focus on, according to Dr. Panigrahi.

Anyway, they excluded practices like smoking, alcohol use, history of STDs, antenatal complications and history of abortions. The rationale is that smoking, for example, doesn’t affect the relationship between open defecation and pregnancy outcomes even if it affects pregnancy by a separate mechanism of its own. Dr. Panigrahi asserts: “After including dozens of factors, we still find a significant association. By adding one or two new factors if at all will have a minimal dent on the statistical model we used.”

The study was published in the open-access journal PLOS Medicine on July 7.

A telescope that gives India a new place in the Sun

The Wire
July 8, 2015

An attempt to make sense of anything in the Solar System cannot happen without first acknowledging the presence, and effects, of the Sun at its centre. And in an effort to expand this understanding, the Udaipur Solar Observatory, Rajasthan, recently added a versatile telescope to its line-up.

The USO is part of a network of six observatories on Earth that continuously monitors activity on the Sun’s surface to determine why solar flares occur and what their impacts are. Flares are violent ejections of particles, heat and magnetic energy by the Sun. Even though Earth’s magnetic field keeps most of these particles from coming too close, satellites are constantly under threat of being struck by them, disrupting electronic communications. Flares also intensify auroras around latitudes close to the poles.

On June 16, the new Multi Application Solar Telescope was flagged off. According to ISRO, it significantly expands the observatory’s capabilities and makes for a versatile tool with which to study the Sun’s complex magnetic fields. Its setup was funded by the Department of Space under the Ministry of Science and Technology. In addition to helping astronomers study solar eruptions, MAST will also complement the existing GREGOR and SOLAR B telescopes, both also studying the Sun. GREGOR is located on the Canary Islands and is operated by Germany while SOLAR B, now called Hinode, is in a sun-synchronous orbit in space and was launched by the Japanese space agency in 2006.

MAST is a Gregorian-Coude telescope with an aperture of 50 cm. The ‘Gregorian’ in its name alludes to the use of a combination of lenses and mirrors in a telescope in which the final image is not upside-down but upright. The coude – French for elbow – is a structural arrangement in which the observation deck doesn’t move when the telescope does.

A proposal for MAST was first pitched by USO in 2004, the optical elements were fabricated in 2008 and the telescope was installed in 2012 following which it underwent testing. USO itself is situated in the middle of Lake Fatehsagar in Udaipur, and its location proved apt for MAST as well. When a telescope on land makes an observation, the light it receives will be distorted by the hot air it passes through. The hotter the air is, often because of the surface underneath, the more the distortion. But in the middle of a lake, the distortion is minimised because air above the lake is relatively cooler and less prone to ‘dancing’ around.

In the next five years, MAST hopes to help obtain a 3D image of the Sun’s atmosphere during times of increased activity, when particles are accelerated to high energies, sunspots form, and magnetic fields whip through in strange patterns. Even though these patterns are recognisable, many of their causes and effects are unknown, while periodic shifts in their intensity and distribution are known to be associated with different phases of a star’s billions-of-years-long life.

Researchers working on telescopes like MAST, GREGOR, BBSO, Hinode, the Solar Dynamics Observatory and others hope to collaborate and resolve these mysteries. The USO in particular hopes to use MAST to chart the directions of magnetic fields that move through the Sun’s photosphere – its outermost layer from which sunlight emerges – and the chromosphere – the atmospheric layer right above the photosphere.

The observations it will make will also be beneficial to the ISRO’s plans – for research as well as, and leading up to, space exploration. Effects of events that play out on the Sun’s surface have consequences that reach well beyond the orbit of Pluto, shaping the composition and atmospheres of all planets on the way. In fact, it was thanks to a solar flare in March 2012 that humankind found out the Voyager 1 probe had exited the Solar System. So plans to chart the interplanetary oceans must accommodate the Sun’s tantrums.

At the time of the GSLV Mk-III launcher’s first test flight in December 2014, K. Radhakrishnan, who headed ISRO at the time, had said that the agency was envisaging a human spaceflight programme commencing in 2021, at the cost of Rs.12,400 crore. And to manufacture a space-borne capsule that is ‘human-ready’, it must just as well be ‘Sun-proof’. The knowledge necessary for such engineering will siphon data from MAST as well as ADITYA and the NASA STEREO telescopes.

Probe encounters glitch 10 days ahead of historic rendezvous with Pluto

The Wire
July 6, 2015

In the last mile of its 3,464-day journey and only ten days away from a historic rendezvous with the dwarf planet Pluto, the New Horizons probe experienced an anomaly on July 4 that prompted the on-board computer to switch to ‘safe mode’. The event caused a communications blackout between New Horizons and mission control at the Applied Physics Laboratory, Maryland, for 90 minutes on Saturday. Now, the probe is transmitting telemetry signals that will help scientists fix it – hopefully in time for its encounter with Pluto and its moons.

And until it’s fixed, science missions – including the detailed pictures it’s been taking of Pluto and Charon of late – will be on pause. Not surprisingly, the incident will have the scientists and engineers operating the probe nervous. As Alan Stern, the mission’s principal investigator, said in June, “There’s only one Pluto flyby planned in all of history, and it’s happening next month!”

New Horizons was launched by NASA on January 19, 2006, with the primary objective of studying Pluto’s surface and atmosphere up-close, as well as observe its moons Charon, Nix, Hydra, Styx and Kerberos. In order to reduce mission costs at the time of launch, New Horizons was not designed to land on Pluto but to fly by it at a distance of about 13,000 km. On planetary scales, that’s small and excellent enough to fetch the dwarf planet out of the blur.

That historic flyby is supposed to happen on July 14. By then, the on-board anomaly needs to be recognized and fixed or the scientists, and humankind, risk losing years of efforts and patient waiting. Nonetheless, if the issue is fixed after the probe has flown past Pluto, it will still be used to study the Plutonian neighbourhood of which we know little. This is the region of space containing the Kuiper Belt objects, a belt of asteroids like the one between the orbits of Mars and Jupiter but over 20-times wider and denser. They comprise the matter leftover after the Solar System’s planets had formed.

According to Emily Lakdawalla, a planetary scientist affiliated with the Planetary Society, the probe is on the right trajectory even in safe mode. She also wrote that there were no pictures set to be taken by the probe on July 4, but some scheduled for July 5 and 6 might be missed.

Safe modes are not an uncommon occurrence on the computers operating satellites and probes, and even rovers. They are in effect similar to how a computer running the Windows OS sometimes slips into safe-mode, often to eliminate a bug that threatens some critical function, by reverting to a very primitive state conducive to troubleshooting.

In March 2013, the Curiosity rover on Mars entered into safe-mode following a computer glitch. In the next two days, its controllers transmitted the necessary code for the software running the rover to fix itself, and the rover was back online again. More recently, in April 2015, the Rosetta probe that’s tracking comet 67P/C-G went into safe-mode after its computer lost contact with radio signals from Earth, thanks to dust blown from the comet interfering with the antennae.

However, what makes the troubleshooting tricky is that New Horizons is 4.8 billion km away – a distance that radio signals take 4.5 hours to travel. This means the total time taken for mission control to send a message to New Horizons and receive a reply is nine hours, and that the problem is likely to be fixed over the course of the next few days. Until then, let’s keep our fingers crossed.
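As a rough check of those figures – a sketch that uses only the distance quoted above and the speed of light:

```python
# Back-of-the-envelope check of the signal delays quoted above.
distance_km = 4.8e9        # New Horizons' distance from Earth, as quoted
c_km_per_s = 299_792.458   # speed of light in km/s

one_way_hours = distance_km / c_km_per_s / 3600
print(f"One-way light time: {one_way_hours:.1f} hours")  # ~4.4 hours
print(f"Round trip: {2 * one_way_hours:.1f} hours")       # ~8.9 hours, i.e. about nine
```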

Update: At 8 am (IST) on July 6, NASA put out a statement saying the problem in the computer had been resolved and that New Horizons would be able to revert to its original science plan on July 7.

The investigation into the anomaly that caused New Horizons to enter “safe mode” on July 4 has concluded that no hardware or software fault occurred on the spacecraft. The underlying cause of the incident was a hard-to-detect timing flaw in the spacecraft command sequence that occurred during an operation to prepare for the close flyby. No similar operations are planned for the remainder of the Pluto encounter.

It added that the down-time will have had a minor effect on the probe’s science objectives in the two days.

Petition asks why Aadhar is a must to unlock Modi’s DigiLocker

The Wire
July 4, 2015

New Delhi: Of all the schemes of the previous Manmohan Singh government, the Aadhar UID program is one that Narendra Modi is the most committed to. On July 1, he flagged off a ‘digital locker’ service for the country as part of his Digital India initiative. According to its website, DigiLocker provides each user with 10 MB of storage space on the web to store and share files as well as makes for a hub on which to access various government documents.

The catch? A user can only sign up using an Aadhar UID number. Lawyers say this is a violation of the Supreme Court’s 2013 order prohibiting the government from making Aadhar compulsory for accessing any public service. Upset by the denial of the DigiLocker facility to those without a UID, Sudhir Yadav has filed a petition in the Supreme Court calling for “exemplary punishment” of those responsible for this. The DigiLocker service comes from the Department of Electronics and Information Technology, under the Ministry of Communications & IT. It offers 10 MB of free storage (with an upgrade to 1 GB hinted at), allows pdf, jpg, jpeg, png, bmp and gif file formats, and stipulates that no single file can be larger than 1 MB. This paltry storage offering is, however, masked by a bigger concern.
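For illustration only, here is a small sketch of the storage rules listed above – a hypothetical helper with invented names, not DigiLocker’s actual API or code:

```python
# Hypothetical helper illustrating the limits quoted above: 10 MB of total
# storage, a 1 MB cap per file, and a fixed set of permitted formats.
# (Function and constant names are invented; this is not DigiLocker's API.)
ALLOWED_EXTENSIONS = {"pdf", "jpg", "jpeg", "png", "bmp", "gif"}
MAX_FILE_BYTES = 1 * 1024 * 1024    # no single file larger than 1 MB
MAX_TOTAL_BYTES = 10 * 1024 * 1024  # 10 MB of storage per user

def can_upload(filename: str, size_bytes: int, used_bytes: int) -> bool:
    """Check a prospective upload against the stated DigiLocker limits."""
    extension = filename.rsplit(".", 1)[-1].lower()
    if extension not in ALLOWED_EXTENSIONS:
        return False
    if size_bytes > MAX_FILE_BYTES:
        return False
    return used_bytes + size_bytes <= MAX_TOTAL_BYTES

print(can_upload("degree_certificate.pdf", 400_000, 2_000_000))  # True
print(can_upload("scan.tiff", 200_000, 0))                       # False: format not allowed
```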

While Modi has frequently used the social media as part of his communication strategy, as well as exhibited some appreciation of technology in his governance, he has stayed away from pushing through legislation on privacy of public data.

In the DigiLocker initiative, there is no clarity about whether the government can access the information stored in the lockers, even as a technical document accompanying the release states: “It is important to mandate use of Aadhaar number in all resident documents to strongly assert ownership”. Troublingly, the same document goes on to say, “… some document types may be available to ‘trusted’ requesters without electronic authentication and authorisation of the owner (a simple consent may suffice)”.

“Either the Modi government is kind of slow on the uptake and hasn’t yet understood that the Supreme Court has thrice – on September 23, 2013, March 24, 2014 and March 16, 2015 – said that services cannot be made incumbent on the UID, or it is telling the Supreme Court that it does not care what the court says and that it will act as it pleases,” says Usha Ramanathan, an independent researcher who has been investigating the UID project since 2009. “The court had also said that the government must change its forms and circulars to make that clear.”