(Current Affairs For SSC Exams) Science & Technology: May 2013

Science & Technology

May 2013

DNA as an Information Storage Device

Since time immemorial, mankind has wanted to record information and preserve it for later use. First it was through cave paintings and symbols. Then we invented alphabets, ideograms, numbers and other symbols. Using these, books were written on palm leaves, papyrus sheets or paper and stored for future generations. The invention of printing brought the Gutenberg revolution, making multiple copies easy to produce and spreading education to millions of people. But printed books occupy space, and libraries and archives are bursting at the seams. Enter the computer age and digitization: alphabets and other symbols are encoded in binary (combinations of zeros and ones) and read as on-off electrical signals, making electronic storage possible and cutting down the size and space needed for ‘hard copies’. Integrated circuits, processors and related electronic wizardry have shrunk computers and storage devices from room size to fingernail size. Even so, the amount of information to be stored — from printed books and Kindle e-books to the Encyclopaedia Britannica and Google — is growing exponentially. “That means the cost of storage is rising but our budgets are not”, as Dr. Nick Goldman of the European Bioinformatics Institute at Hinxton, UK, told The Economist (in its January 26, 2013 issue). Goldman (together with four colleagues at Hinxton and two from Agilent Technologies, California, U.S.) decided to use DNA — yes, the molecule that stores the code that makes life possible — as the information storage device, rather than electronics. Their paper, titled “Towards practical, high-capacity, low-maintenance information storage in synthesized DNA”, has just been published in the journal Nature (doi:10.1038/nature11875).

Why DNA? Indeed, the question should be ‘why not DNA?’. It is a long chain consisting of four letters (chemical units called bases, referred to as A, G, C and T) put together in a string of sequence — similar to what the English language does with its 26 letters and punctuation marks, or digital computers with chosen sequences of zeros and ones. DNA has been used to store and transfer information right through evolution, since life arose over 2 billion years ago. It is small in size — the entire information content of a human is stored in a 3-billion-long sequence of A, G, C and T, packed into the nucleus of a cell smaller than a micron (a thousandth of a millimetre). It is stable and has an admirable shelf life. People have isolated DNA from the bones of dinosaurs that died about 65 million years ago, read the sequence of bases in it and gleaned much information about the animal. The animal (shall we say the ‘host’ of the DNA) is long since dead, but the information lives on. DNA is thus a long-lived, stable and easily synthesized storage ‘hard drive’. While current electronic storage devices require active, continued maintenance and regular transfer between storage media (punched cards to magnetic tapes to floppy disks to CDs...), DNA-based storage needs no active maintenance: just store it in a cool, dark and dry place! The Goldman group is not the first to think of DNA as a storage device. Dr. E.B. Baum tried building an associative memory vastly larger than the brain in 1995, Dr. C.T. Clelland and others ‘hid’ messages in DNA microdots in 1999, J.P.L. Cox wrote in 2001 on long-term data storage in DNA, Ailenberg and Rotstein came up with a coding method for archiving text, images and music characters in DNA, and in 2012 Church, Gao and Kosuri discussed next-generation digital information storage in DNA.

What is novel in the Hinxton method is that they moved away from the conventional binary (0 and 1) code and used a ternary code system (the three numerals 0, 1 and 2, written using combinations of the bases A, G, C and T) to encode the information into DNA. This choice helps avoid reading errors, particularly those arising from repetitive base sequences. Also, rather than synthesize one long string of DNA to encode an entire item of information, they broke each file down into smaller chunks, so that errors during synthesis or read-out can be caught. These chunks are then read back according to a defined protocol, providing for 100 per cent accuracy.

How much information can be stored in DNA? Goldman and colleagues have been able to store 2.2 petabytes (a peta is a million billion, or 10^15) in one gram of DNA — “enough, in other words, to fit all of the world’s digital information into the back of a lorry”, as The Economist puts it. What about the speed, and how does one read the files? Today the writing is slow and the reading, using DNA sequencers, is expensive, but in time the speed will improve and the cost will come down considerably. Recall that it took $3 billion, and many months, to read out the entire human genome a decade ago. Today the speed has improved, and it is predicted that in a couple of years a human genome can be read for $1,000. Even today, DNA-based information storage is a realistic option for archiving long-term, infrequently accessed material.

What did Goldman and group store in DNA? For starters, they stored all 154 sonnets of Shakespeare (in ASCII text), the 1953 Watson-Crick paper on the DNA double helix (in PDF format), a colour photograph of Hinxton (in JPEG) and a clip from the “I Have a Dream” speech of Martin Luther King (in MP3 format). Natural selection and evolution have used DNA to store, and read out, the information that makes our bodies. And we are now using DNA to store and archive the products of our brains. What a twist!
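
The base-rotation idea at the heart of such ternary coding is simple enough to sketch in a few lines. The following is a minimal Python illustration of the general approach — not the authors’ exact scheme (which also adds Huffman coding, indexing information and overlapping fragments); the byte-to-trit conversion here is a simplified stand-in:

```python
# Each trit (0, 1 or 2) selects one of the three bases that DIFFER from
# the previously written base, so the output never repeats a base --
# avoiding the homopolymer runs that cause most sequencing errors.
NEXT_BASE = {'A': 'CGT', 'C': 'GTA', 'G': 'TAC', 'T': 'ACG'}

def to_trits(data: bytes):
    """Turn bytes into base-3 digits (6 trits per byte, since 3**6 >= 256)."""
    for byte in data:
        for _ in range(6):
            yield byte % 3
            byte //= 3

def encode(data: bytes, start: str = 'A') -> str:
    """Map a trit stream onto DNA with no two identical bases in a row."""
    prev, out = start, []
    for trit in to_trits(data):
        prev = NEXT_BASE[prev][trit]
        out.append(prev)
    return ''.join(out)

print(encode(b"To be, or not to be")[:48])
```

Decoding simply reverses the mapping: given the previous base, the position of the current base within the three allowed bases recovers the trit.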

India Ranked Third on the List of Spam-Spewing Nations in the World

India ranked third on the list of countries that distribute spam across the world, after the US and China, according to a new report from SophosLabs, Sophos’s global network of threat analysis centres. In the study, the US was the single highest-ranking country, but Asia stood at the number 1 position among continents, accounting for 36.6 per cent of the world’s overall spam.

The US sent 18.3 per cent of the world’s junk email overall. In the last few months of 2012, India topped the list of spam-spewing nations but eventually fell back to third position, with China holding second place. The study tracked spam sent from December 2012 to February 2013; China and India took second and third positions with 8.2 per cent and 4.2 per cent of the world’s spam respectively.

Uterine Contractions Explained through a Physical Model

Synchronised oscillations in biological systems have elicited a lot of interest and study. In this context, the behaviour of the uterus towards late pregnancy and close to labour, when it goes into synchronised oscillations, begs to be understood. A paper published in February 2012 in the journal Physical Review Letters by a group of scientists from India and France, which includes Rajeev Singh and Sitabhra Sinha of the Institute of Mathematical Sciences, Chennai, gives a theoretical explanation for the observed rhythmic contractions of the uterus during and just before labour. The paper shows that these synchronised oscillations in the uterus do not arise from any central agency (as pacemaker cells do for the heart). Instead, they emerge from electrophysiological changes that take place in the uterus itself. What these changes are, and how they affect the uterus, can be understood by considering the structure of uterine tissue. It consists of electrically excitable smooth muscle cells as well as electrically passive cells. In the tissue, cells are coupled by gap junctions that serve as electrical conductors. These gap junctions have been found to increase markedly during late pregnancy, both in number and in electrical conductance. The correlation between this electrophysiological change and the corresponding tendency of the uterus to go into rhythmic contractions strongly suggests a prominent role for the coupling between the cells.
Hence the group modelled the uterus as a two-dimensional grid of electrically excitable cells, each coupled to one or more passive cells and to its neighbouring excitable cells with a particular strength of intercellular interaction. Solving the resulting equations, they find that as this strength is increased step by step, the system goes through wave-like excitations that “lead to coherent periodic activity, exhibiting cluster, local and global synchronisation under different conditions.” That is, as the strength of intercellular interaction increases, localised regions oscillating at different frequencies tune in to a single frequency, with a few local regions of inactivity remaining. Increasing the strength further causes the whole system to oscillate as a single wave.
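
The flavour of such a model is easy to capture numerically. Below is a minimal sketch — not the authors’ exact equations — using generic FitzHugh-Nagumo excitable-cell kinetics, with each grid cell loaded by a passive cell and diffusively coupled to its neighbours. The coupling strength D stands in for gap-junction conductance, and all parameter values are illustrative:

```python
# Excitable cells (v, w: FitzHugh-Nagumo) on an N x N grid, each loaded
# by a passive cell (v_p) whose resting potential E_p differs from the
# excitable cells' rest -- the current this injects can tip the tissue
# into oscillation. D is the neighbour (gap-junction) coupling strength.
import numpy as np

N, D = 50, 0.5                 # grid size, intercellular coupling strength
g_p, E_p = 0.5, 2.0            # passive-cell coupling and resting potential
dt, steps = 0.05, 4000
rng = np.random.default_rng(0)
v = rng.uniform(-1.5, 1.5, (N, N))  # excitable-cell membrane potential
w = np.zeros((N, N))                # slow recovery variable
v_p = np.full((N, N), E_p)          # passive-cell potential

def laplacian(f):
    """Nearest-neighbour coupling (periodic boundaries, for simplicity)."""
    return (np.roll(f, 1, 0) + np.roll(f, -1, 0) +
            np.roll(f, 1, 1) + np.roll(f, -1, 1) - 4.0 * f)

for _ in range(steps):
    dv = v - v**3 / 3 - w + g_p * (v_p - v) + D * laplacian(v)
    dw = 0.08 * (v + 0.7 - 0.8 * w)
    dv_p = (E_p - v_p) + g_p * (v - v_p)   # passive cell: leak + coupling
    v, w, v_p = v + dt * dv, w + dt * dw, v_p + dt * dv_p

# With small D, patches oscillate at their own rates; raising D pulls
# them toward cluster synchrony and eventually a single global rhythm.
print("spatial mean of v after the run:", float(v.mean()))
```

Sweeping D upward in toy models of this kind reproduces the qualitative sequence the paper describes: independent local oscillations, then clustered synchrony, then one system-wide wave.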

Jeff Bezos Recovered Two Apollo Rocket Engines

In a privately funded expedition, a recovery team backed by Amazon founder Jeff Bezos recovered two Saturn V rocket engines from beneath the Atlantic Ocean. The recovery was announced on 20 March 2013. The Saturn V was the rocket used to send human beings to the Moon, flying from 1967 until 1973.

Jeff Bezos, chief executive of Amazon, announced in 2012 his plan to search the Atlantic sea floor for the rocket engines that plunged back into the ocean after Saturn V launches to the Moon, starting with those of the Apollo 11 mission. The Bezos Expeditions team recovered the two Saturn V first-stage engines from 3 miles beneath the Atlantic.

Saturn V First-stage Rocket Engines

  • The Saturn V was an American human-rated expendable rocket used by NASA’s Apollo and Skylab programmes from 1967 until 1973. NASA launched a total of 13 Saturn Vs from the Kennedy Space Center in Florida.

  • It is the heaviest, tallest and most powerful rocket ever brought to operational status, and it holds the record for the heaviest launch vehicle payload ever.

  • To date, the Saturn V is the only launch vehicle to have transported humans beyond low Earth orbit.

  • The Saturn V’s first-stage engines, the F-1s, were developed by Rocketdyne.

  • Five F-1 engines were used in the S-IC first stage of each Saturn V, which served as the launch vehicle of the Apollo programme.

  • The F-1 is the most powerful single-chamber liquid-fuelled rocket engine.

Natural Gas from Methane Hydrate

Japan on 12 March 2013 announced that it had successfully extracted natural gas from frozen methane hydrate off its central coast, becoming the first country in the world to do so. An official from Japan’s Economy, Trade and Industry Ministry said this was the world’s first offshore experiment in producing natural gas from methane hydrate. Methane hydrates, also known as clathrates, are ice-like solids in which molecules of methane are trapped within a cage of water molecules. The gas field where the natural gas was extracted lies in the Nankai Trough, about 50 km from Japan’s main island.

How was the natural gas from methane hydrate extracted?

The engineers used a depressurisation method, which destabilises the methane hydrate and releases the methane as gas. The production tests were to continue until the end of March 2013. Japanese government officials announced that the aim now is to establish methane hydrate production technologies for practical use by 2018.

How will this extraction benefit Japan?

A Japanese study estimated that around 1.1 trillion cubic metres of methane hydrate exist in the offshore deposits — equal to more than 10 years of Japan’s gas consumption. According to the researchers, the extraction could prove very beneficial as an alternative source of energy for Japan, which has limited natural resources and currently imports virtually all of its energy needs. After the 2011 nuclear disaster at the Fukushima plant, the cost of importing fuel has also increased. Extracting natural gas from methane hydrate would therefore help Japan reduce the pressure on natural resources and bring down its fuel import bill.

Huge Radio Galaxy Discovered

An international team of astronomers led by ASTRON astronomer Dr. George Heald discovered, in the third week of March 2013, a previously unknown giant radio galaxy using the powerful International LOFAR Telescope (ILT), built by ASTRON. The galaxy was discovered in LOFAR’s first all-sky imaging survey, the Multi-frequency Snapshot Sky Survey (MSSS). The new source, comparable in apparent size to the full Moon, was identified during analysis of the MSSS images. The observed radio emission is associated with material ejected from one member of an interacting triplet of galaxies. The physical extent of this material is much larger than the galaxy system itself, extending millions of light years across intergalactic space. The MSSS is still in progress.

What is ASTRON?

ASTRON is the Dutch foundation that conducts research in radio astronomy; the name derives from the Dutch ‘Stichting ASTRonomisch Onderzoek in Nederland’. Radio astronomy is the subfield of astronomy that studies celestial objects at radio frequencies.

What is LOFAR?

LOFAR stands for LOw-Frequency ARray. It was designed and built by the Netherlands astronomical foundation ASTRON, and its operations are managed by ASTRON’s radio observatory. LOFAR is the largest connected radio telescope, built on a new concept that uses large numbers of omni-directional antennas.

What is Multi-frequency Snapshot Sky Survey (MSSS)?

The MSSS is LOFAR’s concerted effort to survey the northern sky at extremely low radio frequencies, at wavelengths ranging from 2 m to 10 m. The primary aim of the survey is to perform an initial scan of the sky and create an all-sky model, which will in turn support deeper observations.

Web Observatory for Cybergazing

How right is it to put content on the Web behind paywalls when there is no fee for posting information on the Web? “Sir Tim Berners-Lee’s [inventor of the World Wide Web] thesis is the Web has to be free… either everybody uses it or nobody uses it,” said Prof. Dame Wendy Hall, Dean of Physical and Applied Sciences at the University of Southampton. “Because the Web is free, people are using it.” Prof. Hall was recently in Chennai to attend the Association for Computing Machinery (ACM) conference. After all, the world would have been very different today had Sir Tim Berners-Lee tried making money out of the Web. “We can’t revisit his experiment. We can’t rerun it. But we can kill the way it works at the moment,” she said emphatically. The Web can work efficiently only if many use it, and to make that happen it has to be free.

She cites the example of The Times, which went behind a paywall in mid-2010. It does make money from its sizeable subscriber base, but the consequence is that the number of people reading it is very limited. “The model is wrong,” Prof. Hall said. “It doesn’t mean there aren’t ways of making money.” “To me, having a newspaper behind a paywall is just nonsense,” she said. Prof. Hall should know, as she has been a part of the Web revolution. She was a member of the team that digitised photos, videos and audio content, and was at the centre of the multimedia and hypermedia revolution. She is also the President of the Association for Computing Machinery (ACM) — the first person from outside North America to hold the post.

One of her pet projects is the Web Observatory. According to her, it is akin to astronomers looking at the sky to understand how the universe started and how the planets evolved. “The Web Observatory is a big analytics platform. It is not about data, but how people are using the data and behaving on the Web,” she explained. “Our thesis is, in order to study the Web, you need to observe what happens on the Web.” To do this, one has to study it every day to understand the dynamics of the Web, its interaction with technology, and what people do with it. It is basically about analysing the data to find out how things evolve. The classic example is Twitter — who is influencing whom, and how things evolve on the microblog. “We have found some interesting similarities and differences in how Twitter is evolving in different regions,” Prof. Hall revealed. “It reflects to some extent the culture.”

ISRO Launched SARAL and Six Other Satellites

Six foreign micro and mini spacecraft, as well as the Indo-French oceanographic study satellite SARAL (Satellite with ARgos and ALtiKa), were launched successfully by ISRO’s PSLV-C20 rocket on 25 February 2013 from the spaceport at Sriharikota. The Polar Satellite Launch Vehicle (PSLV) of the Indian Space Research Organisation (ISRO) flew from the first launch pad of the Satish Dhawan Space Centre and placed all seven satellites successfully into orbit. The President of India, Pranab Mukherjee, witnessed the launch from the mission control centre at Sriharikota.

Satellites Launched by PSLV-C20 Rocket

  • The 410-kg SARAL, with its payloads Argos-3 and AltiKa

  • Two micro-satellites UniBRITE and BRITE from Austria

  • AAUSAT3 from Denmark

  • STRaND-1 from the United Kingdom

  • Micro-satellite NEOSSat from Canada

  • Mini-satellite SAPPHIRE from Canada

The successful launch of all seven satellites demonstrated the PSLV’s versatility and marked its 22nd consecutive successful flight. ISRO also has plans to launch the Geosynchronous Satellite Launch Vehicle with an indigenous cryogenic engine, along with India’s mission to Mars, both scheduled for 2013.

SARAL, the Unique Satellite

  • SARAL is a unique satellite that serves the research community.

  • The satellite will assist researchers in oceanographic studies.

  • SARAL will study sea surface heights and ocean currents.

  • The Argos-3 payload will collect data, while the AltiKa altimeter will measure the height of the sea surface.

  • SARAL will also assist researchers in studying climatic developments.

  • Its practical applications include the study of marine animal migration, continental ice, biodiversity protection and coastal erosion.

A Submerged Continent Found

A group of scientists from Norway, Germany, South Africa and the U.K. have discovered a submerged continent in the Indian Ocean. Their measurements suggest that the continent, which they have named Mauritia, lies under Mauritius and extends more than 1,000 km northwards towards the Seychelles. The discovery was sparked when they found crystals called zircons on Mauritian beaches. Zircons are resistant to erosion and chemical change, and some of the ones they found were almost two billion years old — much older than any of the regular soil or sand samples found on nearby islands. Such old crystals, they reasoned, could only belong to a submerged continent, and may perhaps have been pushed up to the surface by underwater volcanoes. To confirm whether these zircons indeed belonged to such a continent, they consulted satellite data, which can help detect submerged land masses. Sure enough, underneath Mauritius and leading towards the Seychelles, more than 1,000 kilometres away, there were large chunks of crust as thick as 30 km.

Moles Sniff in Stereo, Experiment Shows

Moles need both nostrils to locate food underground, smelling in stereo much the way humans see and hear in stereo, according to research reported on Tuesday. The common mole (Scalopus aquaticus) has tiny eyes tucked between fur and skin and is nearly blind, with small ears attuned only to low-frequency sounds. Curious to understand how the little creature finds food in the dark, biologist Kenneth Catania at Vanderbilt University in Nashville, Tennessee, created a plexiglass chamber in his lab. The box had 15 holes arranged in a semi-circle in the floor, one of which, chosen at random each time, was filled with a tempting piece of earthworm. The chamber was sealed so that Catania could detect, by minute changes in air pressure, every time the mole sniffed. He also filmed the creatures’ movements with a high-speed camera. “It was amazing,” Catania said. “They found the food in less than five seconds and went directly to the right food well almost every time. They have a hyper-sensitive sense of smell.”

In the next step, the scientist blocked one nostril of each mole with a small polyethylene tube. The animals veered off to the side opposite the obscured nostril, but eventually found the food. Finally, Catania inserted small plastic tubes in both nostrils but crossed them over, so that the right nostril was smelling air on the animal’s left and the left nostril was smelling air on the animal’s right. With nostrils crossed, the moles crawled back and forth, searching for a reward they could smell but, bafflingly, could not locate. “The fact that moles use stereo odour cues to locate food suggests other mammals that rely heavily on their sense of smell, like dogs and pigs, might also have this ability,” Catania said.

Middle East has Lost 144 Cubic km of Water

The Middle East lost 144 cubic km of water between 2003 and 2010 — nearly equal to the staggering volume of the Dead Sea — according to data provided by NASA satellites. Four countries of the region along the Tigris and Euphrates — Turkey upstream, with Syria, Iran and Iraq below — alone account for the unprecedented loss.

University of California, Irvine scientists and colleagues say the Tigris-Euphrates watershed is drying up at a pace second only to that in India. “This rate is among the largest liquid freshwater losses on the continents,” they write in the journal Water Resources Research. Water management is a complex issue in the Middle East, “a region that is dealing with limited water resources and competing stakeholders,” says Katalyn Voss, water policy fellow with the University of California Center for Hydrologic Modeling at Irvine, who led the study, according to a university statement. Turkey has jurisdiction over the Tigris and Euphrates headwaters, as well as the reservoirs and infrastructure of its Southeastern Anatolia Project, which dictates how much water flows downstream into Syria, Iran and Iraq.

Detailed Heart Atlas Created

Researchers from the Pompeu Fabra University in Spain have created a high-resolution atlas of the heart based on 3D images taken from 138 people. “This atlas is a statistical description of how the heart and its components — such as the ventricles and the atrium — look,” Corne Hoogendoorn, researcher at Pompeu Fabra University’s CISTIB centre, told SINC, the news agency of the Spanish Foundation for Science and Technology (FECYT). The study can be applied to medical imaging, especially in segmentation — properly differentiating the structure to be analysed from the rest of the image — the journal IEEE Transactions on Medical Imaging reports. The level of detail and the possibility of extending the atlas give it “an advantage over the majority of cardiac models present to date”, adds Hoogendoorn, according to a Pompeu Fabra statement. The scientists created a representation of the average shape of the heart and its variations using images from 138 fully functioning hearts taken by multi-slice computed tomography, a technique that offers high-resolution, three-dimensional X-ray imaging. To create this cardiac map, the researchers developed a statistical model capable of managing the large quantities of information provided by the individual images. It can also capture temporal variations, given that the heart is never motionless.
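
Statistical atlases of this kind are commonly built as point-distribution models. As a sketch of the general construction (the paper’s exact formulation may differ), each heart surface is represented by a vector s of landmark coordinates, and the atlas stores the mean shape plus its principal modes of variation:

$$s \;\approx\; \bar{s} + \Phi\, b, \qquad \bar{s} = \frac{1}{138}\sum_{i=1}^{138} s_i$$

Here the columns of Φ are the leading eigenvectors of the shape covariance matrix and b is a small vector of shape coefficients. Segmenting a new image then amounts to finding the b (and pose) that best fits the image data, which is why such an atlas directly supports the segmentation applications mentioned above.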

Bhabha Atomic Research Centre Developing the World’s Largest Magnet

The Bhabha Atomic Research Centre (BARC), India’s state-owned nuclear research centre, announced that it is developing the world’s largest magnet, weighing 50,000 tons — much bigger than the one at the Compact Muon Solenoid (CMS) detector at CERN in Geneva. The magnet will play a crucial role in the India-based Neutrino Observatory, which will come up in a cavern almost 4,300 feet below a mountain in Tamil Nadu. The head of BARC’s nuclear physics division said the magnet would be the largest in the world in terms of its dimensions; it will be iron-based and will weigh 50,000 tons, whereas the magnet at CERN weighs between 4,000 and 5,000 tons.


15 Very Young Stars, Known as Protostars, Discovered

Astronomers using the Herschel Space Observatory in March 2013 found some of the youngest stars ever seen in the universe. Observations from NASA’s Spitzer Space Telescope and the Atacama Pathfinder Experiment (APEX) telescope in Chile — a collaboration involving the Max Planck Institute for Radio Astronomy in Germany, the Onsala Space Observatory in Sweden, and the European Southern Observatory in Germany — also contributed to the discovery of the new stars, known as protostars. Dense envelopes of gas and dust surround these fledgling stars, making their detection difficult. The 15 newly observed protostars turned up by surprise in a survey of the biggest site of star formation near our solar system, located in the constellation Orion. The finding gives scientists a glimpse into one of the earliest and least understood phases of star formation — the moment when a star begins to form. Astronomers had long investigated the stellar nursery in the Orion Molecular Cloud Complex, a vast collection of star-forming clouds, but had not seen the newly identified protostars until Herschel observed the region.

A Brief Insight into the Observations Made by the Herschel Space Observatory

  • Herschel spied the protostars in far-infrared, or long-wavelength, light, which can shine through the dense clouds around burgeoning stars that block out higher-energy, shorter wavelengths, including the light our eyes see.

  • The Herschel Photodetector Array Camera and Spectrometer (PACS) instrument collected infrared light at 70 and 160 micrometers in wavelength, comparable to the width of a human hair. Researchers compared these observations to previous scans of the star-forming regions in Orion taken by Spitzer.

  • Extremely young protostars identified in the Herschel views but too cold to be picked up in most of the Spitzer data were further verified with radio wave observations from the APEX ground telescope.

  • Of the 15 newly discovered protostars, 11 possess very red colors, meaning their light output trends toward the low-energy end of the electromagnetic spectrum. This output indicates the stars are still embedded deeply in a gaseous envelope, meaning they are very young.

Antarctica needs MPAs

The Commission for the Conservation of Antarctic Marine Living Resources (CCAMLR), at its meeting from October 23 to November 1, 2012, failed to deliver any agreement on marine protected areas (MPAs) for Antarctica’s Southern Ocean. CCAMLR, made up of 24 countries and the European Union, had been considering proposals for turning two critical areas of the Southern Ocean into MPAs: 1.6 million square kilometres of the Ross Sea, the world’s most intact marine ecosystem, and 1.9 million square kilometres of coastal area in East Antarctica. Initially there were two proposals for the Ross Sea, one submitted by the US and one by New Zealand.

At the 2012 meeting, Russia, China and Ukraine blocked efforts to put conservation measures in place. Previously, CCAMLR members had committed to begin establishing a network of MPAs in the Southern Ocean in 2012, and there was no good scientific reason for not meeting that commitment. The CCAMLR process requires the Science Committee to first review all proposals before bringing them for discussion and approval by the Commission members, who take the policy decisions. In the case of the two recent marine protection proposals for the Ross Sea and East Antarctica, the Science Committee had reviewed the science and passed them to the Commission for decisions. However, the nations that opposed the MPAs — China, Russia and Ukraine — claimed that amendments to those proposals meant they should be resubmitted to the Science Committee. How is the meeting in Germany going to be different from what was discussed in previous meetings, including the last one in Hobart, Australia? Ms. Mattfield notes: “There is a chance that the proposals will again be amended as part of this process, and the Antarctic Ocean Alliance urges member nations to remember the conservation objective of the Commission and ensure that the proposals offer the protection needed.”

Higgs Boson Closer than Ever

Ever since CERN announced that it had spotted a Higgs boson-like particle on July 4, 2012, its flagship Large Hadron Collider (LHC), along with similar colliders around the world, has continued running experiments to gather more data on the elusive particle. The latest analysis of the results from these runs was presented at a conference now underway in Italy. While it is still too soon to tell whether the particle spotted in July 2012 is the Higgs boson as predicted in 1964, the data converge toward the conclusion that the long-sought particle does exist, and with the expected properties. More results will be presented over the coming weeks. In time, particle physicists hope, they will once and for all close an important chapter in physics called the Standard Model (SM). The announcements were made by more than 15 scientists from CERN on March 6 via a live webcast from the Rencontres de Moriond, an annual particle physics forum held in La Thuile, Italy, since 1966. “Since the properties of the new particle appear to be very close to the ones predicted for the SM Higgs, I have personally no further doubts,” Dr. Guido Tonelli, former spokesperson of the CMS detector at CERN, told The Hindu. Interesting results from searches for other particles, as well as speculation on the nature of fundamental physics beyond the SM, were also presented at the forum, which runs from March 2 to 16.

A Precise Hunt

A key goal of the latest analyses has been to measure the strength with which the Higgs couples to other elementary particles, in the process giving them mass. This is done by analysing the data to infer the rates at which the Higgs-like particle decays into known lighter particles: W and Z bosons, photons, bottom quarks, tau leptons, electrons and muons. These particles’ signatures are picked up by detectors, from which one infers that a Higgs-like boson decayed into them. The SM predicts these rates with good precision, so any deviation from the expected values could be the first evidence of new, unknown particles. By extension, it would also be the first sighting of ‘new physics’.
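
Such comparisons are conventionally quoted as a signal strength for each decay channel — a standard convention, sketched here for clarity:

$$\mu_{X} \;=\; \frac{(\sigma \cdot \mathrm{BR}_{H \to X})_{\text{observed}}}{(\sigma \cdot \mathrm{BR}_{H \to X})_{\text{SM}}}$$

where σ is the Higgs production cross-section and BR the branching ratio into final state X; μ = 1 means perfect agreement with the Standard Model. The figure of 0.76 ± 0.21 quoted below for the W-boson channel is a ratio of exactly this kind.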

Good and Bad News

After analysis, the results were found to be consistent with a Higgs boson of mass near 125-126 GeV, measured at both 7- and 8-TeV collision energies through 2011 and 2012. The CMS detector observed fairly strong agreement between how often the particle decayed into W bosons and how often it ought to happen according to theory. The ratio between the two was pinned at 0.76 ± 0.21.

Dr. Tonelli said, “For the moment, we have been able to see that the signal is getting stronger and even the difficult-to-measure decays into bottom quarks and tau-leptons are beginning to appear at about the expected frequency.” The ATLAS detector, in parallel, was able to observe at the 99.73 per cent confidence level that the analysed particle has zero spin, another property that brings it closer to the predicted SM Higgs boson. At the same time, the detector observed that the particle’s decay to two photons was 2.3 standard deviations higher than the SM prediction. Dr. Pauline Gagnon, a scientist with the ATLAS collaboration, told this correspondent via email, “We need to assess all its properties in great detail and extreme rigour,” adding that for some aspects they would need more data. Even so, the developments rule out signs of any new physics around the corner until 2015, when the LHC will reopen after a two-year shutdown and multiple upgrades to smash protons at doubled energy. As for the search for supersymmetry — a favoured theoretical framework among physicists to accommodate phenomena that haven’t yet found definition in the Standard Model — Dr. Pierluigi Campana, LHCb detector spokesperson, told The Hindu that there have been only “negative searches so far”.
Supervolcano Forming under the Pacific

Life on Earth could face a threat from a catastrophic “supervolcano” which seismologists believe is due to erupt in 200 million years’ time. At least two “piles” of rock the size of continents are crashing together as they shift at the bottom of Earth’s mantle, 2,900 km beneath the Pacific Ocean, researchers say. “What we may be detecting is the start of one of these large eruptive events that — if it ever happens — could cause very massive destruction on Earth,” said seismologist Michael Thorne, the study’s principal author and an assistant professor of geology and geophysics at the University of Utah. However, disaster is not imminent. “This is the type of mechanism that may generate massive plume eruptions,” he adds. The new study, published in the journal Earth and Planetary Science Letters, says the activity is creating a Florida-sized zone of partly molten rock that may become the root of either of two kinds of massive eruption in the far future: hotspot plume supervolcano eruptions, which have created huge landforms, or gargantuan flood basalt eruptions that created “large igneous provinces” like the Pacific Northwest’s Columbia River basalts 17 million to 15 million years ago, India’s Deccan Traps some 65 million years ago and the Pacific’s huge Ontong Java Plateau basalts, which buried an Alaska-sized area 125-199 million years ago.

Since the early 1990s, scientists have known of the existence of two continent-sized “thermochemical piles” sitting atop Earth’s core and beneath most of Earth’s volcanic hotspots — one under much of the South Pacific and extending up to 20 degrees north latitude, and the other under volcanically active Africa. Using the highest-resolution method yet to make seismic images of the core-mantle boundary, the team found evidence the pile under the Pacific actually is the result of an ongoing collision between two or more piles. Where they are merging is a spongy blob of partly molten rock the size of Florida, Wisconsin or Missouri beneath the volcanically active Samoan hotspot. The study’s computer simulations “show that when these piles merge together, they may trigger the earliest stages of a massive plume eruption,” Thorne said.

ISRO Plans a New High-resolution Earth Satellite

The Indian Space Research Organisation is to build a remote sensing satellite, Cartosat-3, capable of taking images of the earth with a resolution of 0.25 metres. Currently, GeoEye-1 produces the highest-resolution earth images taken by a commercial satellite. The American spacecraft, launched in September 2008, can take panchromatic images with 0.41-metre resolution. WorldView-2, another satellite operated by the same company, DigitalGlobe, offers a best resolution of 0.46 metres. However, in accordance with U.S. regulations, commercially released images from these satellites are degraded to 0.5-metre resolution. DigitalGlobe plans to launch WorldView-3 next year, which will supply images with a resolution of 0.31 metres. Cartosat-3’s camera would better that performance; in the words of one expert, this satellite’s images could allow a scooter to be distinguished from a car. In the ‘Notes on Demands for Grants, 2013-2014’ from the Department of Space, which forms part of the budget documents presented to Parliament recently, Cartosat-3 figures as a separate item with an allocation of Rs. 10 crore. “Cartosat-3 is an advanced remote sensing satellite with enhanced resolution of 0.25 metre for cartographic applications and high-resolution mapping,” the document said. In 1988, ISRO launched India’s first operational remote-sensing satellite, IRS-1A, whose cameras could provide a best resolution of about 36 metres. Seven years later, IRS-1C went into space with a panchromatic camera that had a resolution of 5.8 metres. It supplied the highest-resolution images available from any civilian satellite in the world until Ikonos, an American satellite launched in 1999, began taking images with better than one-metre resolution. India launched the Technology Experiment Satellite in 2001, followed some years later by the Cartosat-2 series of satellites, which could take images with 0.8-metre resolution.
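
What sets a camera’s resolution is captured, to first order, by the standard ground sample distance relation; the numbers below are purely illustrative and are not ISRO’s design parameters:

$$\mathrm{GSD} \;=\; \frac{H \, p}{f}$$

where H is the orbital altitude, p the detector pixel pitch and f the focal length. For instance, a hypothetical camera at H = 500 km with p = 7 µm pixels would need a focal length of f = 14 m to reach a GSD of 0.25 m — which is why very high-resolution imagers demand long, carefully folded optics.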

Researchers Calculated Exact Distance to Our Closest Neighbouring Galaxy

Researchers led by Grzegorz Pietrzynski of the Universidad de Concepción in Chile and the Warsaw University Observatory in Poland have pinned down the distance from the Milky Way to its nearest neighbouring galaxy. That galaxy, the Large Magellanic Cloud (LMC), was found to lie 163,000 light years away, or 49.97 kiloparsecs.

Large Magellanic Cloud (LMC)

  • The Large Magellanic Cloud (LMC) is a dwarf galaxy that floats in space around our galaxy, the Milky Way, orbiting it much as the Moon orbits the Earth.

  • The LMC encompasses huge clouds of gas, which gradually collapse to form new stars. These young stars light up the gas in the colours visible in images taken by the Hubble Space Telescope.

  • The LMC also includes the Tarantula Nebula, the brightest stellar nursery in our cosmic neighbourhood.

Importance of the Findings

The findings about the distance of the LMC from the Milky Way are crucial because they help determine the scale of our universe. They can also be used to determine the rate of expansion of the universe, characterised by the Hubble constant, named after the astronomer Edwin P. Hubble, who discovered in 1929 that the universe is continuously expanding. Determining the Hubble constant accurately is highly important for finding out both the age and the size of the universe, and the exact distance to the LMC has long been one of the largest uncertainties affecting past measurements.
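
The connection is easiest to see through Hubble’s law (a standard relation, stated here for clarity):

$$v = H_0 \, d$$

A galaxy’s recession velocity v is straightforward to measure from its redshift, but its distance d must be calibrated through a ‘distance ladder’ whose first rungs are anchored by nearby objects such as the LMC. Any error in the LMC distance therefore propagates directly into the inferred value of the Hubble constant H₀.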

How was the research made?

Lead researcher Grzegorz Pietrzynski said the team would now work on improving the accuracy of the measurement even further. The distance was calculated by observing rare close pairs of stars known as eclipsing binaries, which are gravitationally bound to each other. Once per orbit, the overall brightness of such a system drops as one of the stars eclipses its partner. By carefully tracking these changes in brightness, and by measuring the stars’ orbital speeds, it is possible to work out the sizes of the stars and the details of their orbits. Combining this with a measurement of apparent brightness yields a remarkably accurate distance. In the study, a sample of binaries with extremely long orbital periods — ideal for precise distance determination — was observed for 16 years. The new measurements reduce the uncertainty in the Hubble constant to 3 per cent, with an improvement to 2 per cent expected in the years to come.
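
In outline, the method rests on two textbook relations (a simplified sketch, not the authors’ full analysis, which uses empirical surface brightness-colour calibrations):

$$L = 4\pi R^{2} \sigma T_{\mathrm{eff}}^{4}, \qquad F = \frac{L}{4\pi d^{2}} \;\;\Rightarrow\;\; d = R\sqrt{\frac{\sigma T_{\mathrm{eff}}^{4}}{F}}$$

The stellar radius R follows from the eclipse durations and orbital velocities, the effective temperature T_eff (in practice, a surface brightness) from the stars’ colours, and F is the measured apparent flux; the distance d then follows with very few assumptions.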

A Trial Drug Raises Hope of Eradicating Malaria

A candidate drug, ELQ-300, was found capable of treating and preventing malaria infection, and even of blocking transmission, in trials on mice. While currently available drugs target the parasite at the blood stage of infection, the candidate drug targets both the liver and blood stages. Going beyond destroying the parasite in the body, the drug (a quinolone-3-diarylether) was found effective in preventing infection by attacking the parasite forms that are crucial to disease transmission (gametocytes, and the vector stages — zygote, ookinete and oocyst). “ELQ-300 has potential as a new drug for the treatment, prevention, and, ultimately, eradication of human malaria,” notes a paper published today (March 21) in the journal Science Translational Medicine. The Editor’s summary underlines the same message: “ELQ-300…[can] prevent and treat malaria, with the potential to aid in eradication of the disease.” Any drug that does even half of what ELQ-300 is capable of will be a boon — nearly 200 million people in the world suffer from malaria every year, and mortality is as high as 1.2 million. To make matters worse, resistance to currently available drugs is emerging. Two candidate drugs — ELQ-300 and P4Q-391 — were tested against both Plasmodium falciparum and Plasmodium vivax species, using isolates taken from patients infected with malaria in southern Papua, Indonesia. ELQ-300 was found to be superior against both drug-resistant species.

Abundant Active Bacterial Community Discovered

A team of researchers led by Ronnie Glud of the University of Southern Denmark has discovered that a huge community of bacteria thrives in the depths of the Mariana Trench, near the Pacific Ocean’s Mariana Islands. The organisms live at densities ten times higher than on the shallower ocean floor at the rim of the trench. The deepest point on the entire seafloor, called the Challenger Deep, lies in the Mariana Trench, about 36,000 feet (roughly 6.8 miles) below the surface of the ocean.

How was the research done?

To explore this ultra-deep ecosystem, the international team of researchers sent a specially designed 1,300-pound robot into the depths of the Mariana Trench in 2010. The robot was fitted with thin sensors that could penetrate the seafloor sediments to measure the consumption of oxygen. Because all living organisms consume oxygen during respiration, the quantity of microorganisms living in an area can be estimated from how much ambient oxygen is missing from the sediments. The researchers used the device to sample the sediments at two sites, at depths of 35,476 and 35,488 feet, and found that a large amount of oxygen consumption was taking place. This indicated that there were ten times more bacteria at the ultra-deep sites than at the shallower site sampled for reference around 37 miles away, at a depth of just 19,626 feet.
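
The inference from oxygen to microbes follows a standard steady-state diffusion argument: where the measured oxygen profile curves, oxygen is being consumed. The sketch below illustrates the principle only — it is not the team’s actual processing pipeline, and the diffusivity and profile values are invented for illustration:

```python
# Infer volumetric O2 consumption R(z) from the curvature of an O2 depth
# profile, assuming steady-state diffusion: D_s * d2C/dz2 = R(z).
import numpy as np

D_s = 1.2e-9                      # sediment O2 diffusivity, m^2/s (assumed)
z = np.linspace(0.0, 0.02, 41)    # depth below the seafloor, m (0-2 cm)
C = 0.20 * np.exp(-z / 0.005)     # illustrative O2 profile, mol/m^3

d2C_dz2 = np.gradient(np.gradient(C, z), z)  # numerical second derivative
R = D_s * d2C_dz2                 # consumption rate, mol m^-3 s^-1

# Depth-integrated uptake: higher values mean more respiring microbes.
uptake = np.trapz(R, z)           # mol m^-2 s^-1
print(f"depth-integrated O2 uptake: {uptake:.2e} mol m^-2 s^-1")
```

Comparing this depth-integrated uptake between the trench-floor sites and the reference site is what yields the ‘ten times more bacteria’ figure quoted above.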

What did the specially designed 1,300-pound robot find?

The robot brought back a total of 21 sediment cores from the two deep sites, which were then analysed in the lab. Although many of the microorganisms died on being brought to the surface, the analysis confirmed that the cores from the Mariana Trench harboured higher densities of bacterial cells than those from the reference site. The ocean floor was also recorded on video, with lights illuminating the dark environment. The footage revealed life forms larger than bacteria on top of the sediment; these were identified as Hirondellea gigas, a species of amphipod — small crustaceans just under one inch in length.

Importance of the Research

The finding of abundant bacterial life at such depths is very surprising, because it was believed that not enough nutrients could reach them. Photosynthetic plankton form the nutrient base of almost the entire ocean food chain, but these plankton cannot survive on the lightless seafloor. The research has therefore intrigued scientists, because the ultra-deep trench turned out to host far more bacterial activity than the shallower reference site nearby. Since the 2010 exploration, the team has also sent the robot to sample the Japan Trench, which is roughly 29,500 feet deep, and now plans to sample the Kermadec-Tonga Trench, which is 35,430 feet deep.

First Smartphone into Space Launched Successfully From India

The Surrey Space Centre (SSC) of the University of Surrey announced on 26 February 2013 that STRaND-1, a nano-satellite carrying a smartphone, had been launched successfully into space from India. With this launch, India became the first country to send the world’s first smartphone-carrying satellite into orbit.

About STRaND-1 and Apps on Board

STRaND-1 is a training and demonstration mission designed to test commercial off-the-shelf technology in space. The satellite weighs 4.3 kg and was launched into a 785-km Sun-synchronous orbit on ISRO’s PSLV launcher. It carries several applications on board, designed by the winners of a Facebook competition organised in 2012; the apps are described below.

  • For example, iTesa will record the magnitude of the magnetic field surrounding the smartphone during its orbit, and will in turn help detect magnetic oscillations in the upper atmosphere known as Alfvén waves.

  • Another app, Scream in Space, designed by Cambridge University Space Flight, will make use of the smartphone’s speakers.

  • Yet another app, STRAND Data, will show the satellite’s telemetry on the smartphone’s display, which will in turn be visible to an additional camera on board. This will eventually allow new graphical telemetry for interpreting trends.

  • The 360 app allows images to be taken with the smartphone’s camera, and will also help use the technology on board to establish STRaND-1’s position.

Raising the Bogey of Radiation

The cell phone industry has registered phenomenal growth in India. Cell towers have mushroomed all over the country and led to growing concerns about the health effects of radiofrequency (RF) radiation. Agents masquerading as ‘experts’, selling radiation-‘protective’ screens, fanned the fire. They claimed that cell tower radiation can cause “sleep disturbances, headaches, fatigue, joint pains, memory loss, increased heart rate” and that “...prolonged exposure to cell tower radiation increases the risk of neurological disorders and cancer,” creating a phobia among the public. They would not accept that, since the energy of RF radiation from cell phone towers is not enough to break chemical bonds in DNA molecules, it cannot cause cancer. While most countries accepted the guidelines of the International Commission on Non-Ionizing Radiation Protection (ICNIRP), India enforced limits one-tenth of the ICNIRP guidelines from September 1, 2012, based on the advice of an Inter-Ministerial Committee; India’s guidelines thus carry a safety factor of 500.

One agent claimed that Mumbai, with its many cell towers, is like an open microwave oven, and that by accepting the ICNIRP guidelines we are accepting that a child can safely be kept in a microwave oven for 19 minutes a day! Actually, the possible temperature increase of a human body at ICNIRP levels is about 0.1 °C; at DoT levels, 0.01 °C. The agents also claimed that the Specific Absorption Rate (SAR) limit for cell phones — a safety standard of 1.6 W per kg — is actually for six minutes of use per day, and asserted that phones should therefore not be used for more than 18-20 minutes daily. Many reporters publicized these scary sound bites. In reality, a cell phone kept near the ear causes a small increase in temperature in regions close to the phone. Thermoregulatory mechanisms such as blood flow remove the heat, establishing equilibrium in about six minutes; thereafter, there is no further increase in temperature. The six-minute interval is simply the time the body’s defences take to reach the equilibrium temperature.
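
A rough back-of-the-envelope estimate shows why the temperature rise stays so small. Ignoring all heat removal (the worst case), tissue absorbing RF energy at the SAR limit warms at a rate set by its specific heat capacity; taking c ≈ 3500 J kg⁻¹ K⁻¹ as a typical value for soft tissue (an assumed figure, for illustration only):

$$\Delta T \;\approx\; \frac{\mathrm{SAR} \times t}{c} \;=\; \frac{1.6\ \mathrm{W\,kg^{-1}} \times 360\ \mathrm{s}}{3500\ \mathrm{J\,kg^{-1}\,K^{-1}}} \;\approx\; 0.16\ ^{\circ}\mathrm{C}$$

Once blood flow establishes equilibrium after those six minutes, the temperature holds near this fraction-of-a-degree level instead of climbing further — consistent with the small rises quoted above.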

NASA’s Swift Satellite Discovered one of the Youngest-known Supernova Remnants

NASA’s Swift satellite in March 2013 discovered one of the youngest known supernova remnants in our Milky Way galaxy, believed to be less than 2,500 years old. The remnant was discovered during an extensive X-ray survey of the galaxy’s central region. Based on the coordinates of its sky position, it has been designated G306.3-0.9. Analysis by the scientists indicates that G306.3-0.9 is likely less than 2,500 years old, making it one of the 20 youngest remnants identified.

Astronomers have previously catalogued more than 300 supernova remnants in the galaxy. To further investigate the object, the team followed up with an 83-minute exposure using NASA’s Chandra X-ray Observatory and additional radio observations from the Australia Telescope Compact Array (ATCA), located near the town of Narrabri in New South Wales. Using an estimated distance of 26,000 light-years for G306.3-0.9, the scientists determined that the explosion’s shock wave is racing through space at about 2.4 million km/h. The Chandra observations reveal the presence of iron, neon, silicon and sulfur at temperatures exceeding 28 million °C — a reminder not only of the energies involved but of the role supernovae play in seeding the galaxy with heavy elements produced in the hearts of massive stars.

About Supernova Explosion

A supernova explosion occurs once or twice a century in the Milky Way. The expanding blast wave and hot stellar debris slowly dissipate over hundreds of thousands of years, eventually mixing with and becoming indistinguishable from interstellar gas. Earlier, in 2011, Swift had imaged a survey field near the southern border of the constellation Centaurus. Although nothing unusual appeared in the ultraviolet exposure, the X-ray image revealed an extended, semi-circular source reminiscent of a supernova remnant. A search of archival data revealed counterparts in Spitzer infrared imagery and in radio data from the Molonglo Observatory Synthesis Telescope in Australia.

Path-breaking Move

Call it by whatever name — innovation, revolution or evolution — PeerJ, the new Open Access journal, which published its first 30 peer-reviewed papers on February 12, is breaking new ground in academic publishing. Hold your breath: a scientific paper in the field of biology or medical sciences can be published for as little as $99. The low publication fee removes one of the last barriers to making OA publishing the most successful model.

The concept of charging a small amount is based on the premise that “if society can set a goal to sequence a human genome for just $99 then why shouldn’t academics be given the opportunity to openly publish their research for a similar amount?” To publish a paper, each author has to be a member; however, when a paper has 13 or more authors, only 12 need to pay the fee. Three membership options are available — $99 for one publication a year, $199 for two publications a year and $299 for unlimited publications a year. Compare this with other models: subscription journals sit behind paywalls and require readers to pay a steep price to read the content, while Open Access journals generally require authors to pay a certain amount. In the case of PLoS ONE, authors are charged about $1,400 per paper, which is waived in deserving cases. Compared with these two models, the PeerJ model offers the best of both worlds — Open Access plus a very low publication fee. If the Open Access movement is experimenting with several options, this one, for now, takes the cake.

True, only time will tell whether this model will survive, serve the interests of the scientific community and go on to become a benchmark for low-cost OA models. According to Nature, every PeerJ member is required to peer review at least one paper a year; by adopting this requirement, the journal effectively tackles the shortage of peer reviewers. PeerJ has been launched by Jason Hoyt (formerly at Mendeley) and Peter Binfield (formerly at PLoS ONE). It has an Editorial Board comprising 800 academics and 20 Advisory Board members. Sharing his experience in The Guardian blog, Michael Taylor, the author of one of the first 30 papers in PeerJ, notes: “In a move towards increasing transparency, the peer reviews, our response letters and the handling editor’s comments are all online alongside the paper [https://peerj.com/articles/36/]. This is good not only because it shows that no corners were cut, but also because the reviewers can receive the credit they deserve for their contributions.”

Gases Work with Particles to Promote Cloud Formation

Researchers have published a study in PNAS showing — for the first time — that certain volatile organic gases can promote cloud formation in a way never considered before by atmospheric scientists.
