Wednesday 29 February 2012

Superfluorescence seen from solid-state material

In a flash, the world changed for Tim Noe -- and for physicists who study what they call many-body problems. The Rice University graduate student was the first to see, in the summer of 2010, proof of a theory that solid-state materials are capable of producing an effect known as superfluorescence.

That can only happen when "many bodies" -- in this case, electron-hole pairs created in a semiconductor -- decide to cooperate.

Noe, a student of Rice physicist Junichiro Kono, and their research team used high-intensity laser pulses, a strong magnetic field and very cold temperatures to create the conditions for superfluorescence in a stack of 15 undoped quantum wells. The wells were made of indium, gallium and arsenic and separated by barriers of gallium arsenide (GaAs). The researchers' results were reported this week in the journal Nature Physics.

Noe spent weeks at the only facility with the right combination of gear to carry out such an experiment, the National High Magnetic Field Laboratory at Florida State University. There, he placed the device in an ultracold (as low as 5 kelvins) chamber, pumped up the magnetic field (which effectively makes the "many body" particles -- the electron-hole pairs -- more sensitive and controllable) and fired a strong laser pulse at the array.

"When you shine light on a semiconductor with a photon energy larger than the band gap, you can create electrons in the conduction band and holes in the valence band. They become conducting," said Kono, a Rice professor of electrical and computer engineering and in physics and astronomy. "The electrons and holes recombine -- which means they disappear -- and emit light. One electron-hole pair disappears and one photon comes out. This process is called photoluminescence."

The Rice experiment acted just that way, but pumping strong laser light into the layers created a cascade among the quantum wells. "What Tim discovered is that in these extreme conditions, with an intense pulse of light on the order of 100 femtoseconds (quadrillionths of a second), you create many, many electron-hole pairs. Then you wait for hundreds of picoseconds (mere trillionths of a second) and a very strong pulse comes out," Kono said.

In the quantum world, that's a long gap. Noe attributes that "interminable" wait of trillionths of a second to the process going on inside the quantum wells. There, the 8-nanometer-thick layers soaked up energy from the laser as it bored in and created what the researchers called a magneto-plasma, a state consisting of a large number of electron-hole pairs. These initially incoherent pairs suddenly line up with each other.

"We're pumping (light) to where absorption's only occurring in the GaAs layers," Noe said. "Then these electrons and holes fall into the well, and the light hits another GaAs layer and another well, and so on. The stack just increases the amount of light that's absorbed." The electrons and holes undergo many scattering processes that leave them in the wells with no coherence, he said. But as a result of the exchange of photons from spontaneous emission, a large, macroscopic coherence develops.

Like a capacitor in an electrical circuit, the wells become saturated and, as the researchers wrote, "decay abruptly" and release the stored charge as a giant pulse of coherent radiation.

"What's unique about this is the delay time between when we create the population of electron-hole pairs and when the burst happens. Macroscopic coherence builds up spontaneously during this delay," Noe said.

Kono said the basic phenomenon of superfluorescence has been seen for years in molecular and atomic gases but wasn't sought in a solid-state material until recently. The researchers now feel such superfluorescence can be fine-tuned. "Eventually we want to observe the same phenomenon at room temperature, and at much lower magnetic fields, maybe even without a magnetic field," he said.

Even better, Kono said, it may be possible to create superfluorescent pulses with any desired wavelength in solid-state materials, powered by electrical rather than light energy.

The researchers said they expect the paper to draw serious interest from their peers in a variety of disciplines, including condensed matter physics; quantum optics; atomic, molecular and optical physics; semiconductor optoelectronics; quantum information science; and materials science and engineering.

There's much work to be done, Kono said. "There are several puzzles that we don't understand," he said. "One thing is a spectral shift over time: The wavelength of the burst is actually changing as a function of time when it comes out. It's very weird, and that has never been seen."

Noe also observed superfluorescent emission with several distinct peaks in the time domain, another mystery to be investigated.

Monday 27 February 2012

Earth's energy budget remained out of balance despite unusually low solar activity

A new NASA study underscores the fact that greenhouse gases generated by human activity -- not changes in solar activity -- are the primary force driving global warming.

The study offers an updated calculation of Earth's energy imbalance, the difference between the amount of solar energy absorbed by Earth's surface and the amount returned to space as heat. The researchers' calculations show that, despite unusually low solar activity between 2005 and 2010, the planet continued to absorb more energy than it returned to space.

James Hansen, director of NASA's Goddard Institute for Space Studies (GISS) in New York City, led the research. Atmospheric Chemistry and Physics published the study last December.

Total solar irradiance, the amount of energy produced by the sun that reaches the top of each square meter of Earth's atmosphere, typically declines by about a tenth of a percent during cyclical lulls in solar activity caused by shifts in the sun's magnetic field. Usually solar minimums occur about every eleven years and last a year or so, but the most recent minimum persisted more than two years longer than normal, making it the longest minimum recorded during the satellite era.

Pinpointing the magnitude of Earth's energy imbalance is fundamental to climate science because it offers a direct measure of the state of the climate. Energy imbalance calculations also serve as the foundation for projections of future climate change. If the imbalance is positive and more energy enters the system than exits, Earth grows warmer. If the imbalance is negative, the planet grows cooler.

Hansen's team concluded that Earth absorbed, on average, more than half a watt more solar energy per square meter than it released over the six-year study period. The calculated value of the imbalance (0.58 watts of excess energy per square meter) is more than twice as large as the reduction in the amount of solar energy supplied to the planet between maximum and minimum solar activity (0.25 watts per square meter).
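Those two numbers can be cross-checked with simple arithmetic. The sketch below uses round textbook values for solar irradiance and Earth's albedo; only the 0.58 watts per square meter figure comes from the study.

```python
# Back-of-envelope check: how big is the solar max-to-min swing compared
# with the measured 0.58 W/m^2 imbalance? (Round illustrative values.)

TSI = 1361.0       # total solar irradiance at top of atmosphere, W/m^2
ALBEDO = 0.3       # fraction of sunlight reflected straight back to space
CYCLE_DIP = 0.001  # ~0.1 percent decline in TSI at solar minimum

# Spread over the whole sphere (factor of 4) and corrected for albedo:
solar_swing = TSI * CYCLE_DIP * (1 - ALBEDO) / 4
print(f"solar max-min swing: {solar_swing:.2f} W/m^2")  # ~0.24 W/m^2

imbalance = 0.58   # measured excess absorption, W/m^2 (from the study)
print(f"imbalance is {imbalance / solar_swing:.1f}x the swing")  # ~2.4x
```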

"The fact that we still see a positive imbalance despite the prolonged solar minimum isn't a surprise given what we've learned about the climate system, but it's worth noting because this provides unequivocal evidence that the sun is not the dominant driver of global warming," Hansen said.

According to calculations conducted by Hansen and his colleagues, the 0.58 watts per square meter imbalance implies that carbon dioxide levels need to be reduced to about 350 parts per million to restore the energy budget to equilibrium. The most recent measurements show that carbon dioxide levels are currently 392 parts per million and scientists expect that concentration to continue to rise in the future.

Climate scientists have been refining calculations of Earth's energy imbalance for many years, but this newest estimate improves on previous attempts because it draws on better measurements of ocean temperature than were available in the past.

The improved measurements came from free-floating instruments that directly monitor the temperature, pressure and salinity of the upper ocean to a depth of 2,000 meters (6,560 feet). The network of instruments, known collectively as Argo, has grown dramatically in recent years since researchers first began deploying the floats a decade ago. Today, more than 3,400 Argo floats actively take measurements and provide data to the public, mostly within 24 hours.

Hansen's analysis of the information collected by Argo, along with other ground-based and satellite data, shows the upper ocean has absorbed 71 percent of the excess energy and the Southern Ocean, where there are few Argo floats, has absorbed 12 percent. The abyssal zone of the ocean, between about 3,000 and 6,000 meters (9,800 and 20,000 feet) below the surface, absorbed five percent, while ice absorbed eight percent and land four percent.

The updated energy imbalance calculation has important implications for climate modeling. Its value, which is slightly lower than previous estimates, suggests that most climate models overestimate how readily heat mixes deeply into the ocean and significantly underestimate the cooling effect of small airborne particles called aerosols, which along with greenhouse gases and solar irradiance are critical factors in energy imbalance calculations.

"Climate models simulate observed changes in global temperatures quite accurately, so if the models mix heat into the deep ocean too aggressively, it follows that they underestimate the magnitude of the aerosol cooling effect," Hansen said.

Aerosols, which can either warm or cool the atmosphere depending on their composition and how they interact with clouds, are thought to have a net cooling effect. But estimates of their overall impact on climate are quite uncertain given how difficult it is to measure the distribution of the particles on a broad scale. The new study suggests that the overall cooling effect from aerosols could be about twice as strong as current climate models suggest, largely because few models account for how the particles affect clouds.

"Unfortunately, aerosols remain poorly measured from space," said Michael Mishchenko, a scientist also based at GISS and the project scientist for Glory, a satellite mission designed to measure aerosols in unprecedented detail that was lost after a launch failure in early 2011. "We must have a much better understanding of the global distribution of detailed aerosol properties in order to perfect calculations of Earth's energy imbalance," said Mishchenko.

Saturday 25 February 2012

Scientists see "sloshing" galaxy cluster

A Naval Research Laboratory scientist is part of a team that has recently discovered that vast clouds of hot gas are "sloshing" in Abell 2052, a galaxy cluster located about 480 million light years from Earth. The scientists are studying the hot (30 million degree) gas using X-ray data from NASA's Chandra X-ray Observatory, along with optical data from the Very Large Telescope that reveals the galaxies themselves.

"The X-ray images were amazing. We were able to see gas sloshing like liquid in a glass" explains NRL's Dr. Tracy Clarke. "Of course this would be one enormous glass since we see the gas sloshing over a region of nearly a million light years across!"

The Chandra data reveal the huge spiral structure in the hot gas around the outside of the image. Zooming in on the cluster reveals "cavities" or "bubbles" surrounding the central giant elliptical galaxy. The spiral began when a small cluster of galaxies collided off-center with a larger one positioned around that central galaxy.

The gravitational attraction of the smaller cluster drew the hot gas out of the central cluster toward the smaller one. Once the smaller cluster passed the central cluster core, the gas movement reversed and it was pulled back toward the center of the main cluster. The hot cluster gas overshot the cluster center, creating a "sloshing" effect like the one seen when a glass holding a liquid is quickly jerked sideways. In the cluster, gravity pulls back on the gas cloud, creating the spiral pattern.

For scientists, the observation of the "sloshing" motion in Abell 2052 is important for two reasons. First, the "sloshing" helps to move some of the cooler, dense gas in the center of the core farther away from the core. This cooler gas is only about 10 million degrees, compared with the average temperature of 30 million degrees. This movement reduces the amount of cooling in the cluster core and could limit the number of new stars being formed in the central galaxy. The "sloshing" movement in Abell 2052 also helps redistribute heavy elements like iron and oxygen, which are created in supernova explosions. These heavy elements are an important part of the make-up of future stars and planets. The fact that Chandra's observation of Abell 2052 lasted more than a week was critical in providing scientists with the details detected in this image.

Besides the large-scale spiral feature, the Chandra observations also allowed scientists to see details in the center of the cluster related to outbursts from the supermassive black hole. The data reveal bubbles resulting from material blasted away from the black hole, surrounded by dense, bright, cool rims. Just as the "sloshing" helps to reduce the cooling of the gas at the core of the cluster, the bubble activity limits the growth of the galaxy and its supermassive black hole.

Thursday 23 February 2012

Why pure quantum dots and nanorods shine brighter

To the lengthy list of serendipitous discoveries -- gravity, penicillin, the New World -- add this: Scientists with the U.S. Department of Energy (DOE)'s Lawrence Berkeley National Laboratory (Berkeley Lab) have discovered why a promising technique for making quantum dots and nanorods has so far been a disappointment. Better still, they've also discovered how to correct the problem.

A team of researchers led by chemist Paul Alivisatos, director of Berkeley Lab, and Prashant Jain, a chemist now with the University of Illinois, has discovered why nanocrystals made from multiple components in solution via the exchange of cations (positive ions) have been poor light emitters. The problem, they found, stems from impurities in the final product. The team also demonstrated that these impurities can be removed through heat.

"By heating these nanocrystals to 100 degrees Celsius, we were able to remove the impurities and increase their luminescence by 400-fold within 30 hours," says Jain, a member of Alivisatos' research group when this work was done. "When the impurities were removed the optoelectronic properties of nanocrystals made through cation-exchange were comparable in quality to dots and nanorods conventionally synthesized."

Says Alivisatos, "With our new findings, the cation-exchange technique really becomes a method that can be widely used to make novel nanocrystals of high optoelectronic grade."

Jain is the lead author and Alivisatos the corresponding author of a paper describing this work in the journal Angewandte Chemie titled "Highly Luminescent Nanocrystals From Removal of Impurity Atoms Residual From Ion Exchange Synthesis." Other authors were Brandon Beberwyck, Lam-Kiu Fong and Mark Polking.

Quantum dots and nanorods are light-emitting semiconductor nanocrystals that have a broad range of applications, including bio-imaging, solar energy and display screen technologies. Typically, these nanocrystals are synthesized from colloids -- particles suspended in solution. As an alternative, Alivisatos and his research group developed a new solution-based synthesis technique in which nanocrystals are chemically transformed by exchanging or replacing all of the cations in the crystal lattice with another type of cation. This cation-exchange technique makes it possible to produce new types of core/shell nanocrystals that are inaccessible through conventional synthesis. Core/shell nanocrystals are heterostructures in which one type of semiconductor is enclosed within another, for example, a cadmium selenide (CdSe) core and a cadmium sulfide (CdS) shell.

"While holding promise for the simple and inexpensive fabrication of multicomponent nanocrystals, the cation-exchange technique has yielded quantum dots and nanorods that perform poorly in optical and electronic devices," says Alivisatos, a world authority on nanocrystal synthesis who holds a joint appointment with the University of California (UC) Berkeley, where he is the Larry and Diane Bock professor of Nanotechnology.

As Jain tells the story, he was in the process of disposing of CdSe/CdS nanocrystals in solution that were six months old when, out of habit, he tested the nanocrystals under ultraviolet light. To his surprise he observed significant luminescence. Subsequent spectral measurements, compared against the original data, showed that the luminescence of the nanocrystals had increased by at least sevenfold.

"It was an accidental finding and very exciting," Jain says, "but since no one wants to wait six months for their samples to become high quality I decided to heat the nanocrystals to speed up whatever process was causing their luminescence to increase."

Jain and the team suspected -- and subsequent study confirmed -- that impurities, original cations left behind in the crystal lattice during the exchange process, were the culprit.

"Even a few cation impurities in a nanocrystal are enough to be effective at trapping useful, energetic charge-carriers," Jain says. "In most quantum dots or nanorods, charge-carriers are delocalized over the entire nanocrystal, making it easy for them to find impurities, no matter how few there might be, within the nanocrystal. By heating the solution to remove these impurities and shut off this impurity-mediated trapping, we give the charge-carriers enough time to radiatively combine and thereby boost luminescence."

Since charge-carriers are also instrumental in electronic transport, photovoltaic performance, and photocatalytic processes, Jain says that shutting off impurity-mediated trapping should also boost these optoelectronic properties in nanocrystals synthesized via the cation-exchange technique.

Tuesday 21 February 2012

New ideas sharpen focus for greener aircraft

Leaner, greener flying machines for the year 2025 are on the drawing boards of three industry teams under contract to the NASA Aeronautics Research Mission Directorate's Environmentally Responsible Aviation Project.

Teams from The Boeing Company in Huntington Beach, Calif., Lockheed Martin in Palmdale, Calif., and Northrop Grumman in El Segundo, Calif., have spent the last year studying how to meet NASA goals to develop technology that would allow future aircraft to burn 50 percent less fuel than aircraft that entered service in 1998 (the baseline for the study), with 75 percent fewer harmful emissions; and to shrink the size of geographic areas affected by objectionable airport noise by 83 percent.

"The real challenge is we want to accomplish all these things simultaneously," said ERA project manager Fay Collier. "It's never been done before. We looked at some very difficult metrics and tried to push all those metrics down at the same time."

So NASA put that challenge to industry -- awarding a little less than $11 million to the three teams to assess what kinds of aircraft designs and technologies could help meet the goals. The companies have just given NASA their results.

"We'll be digesting the three studies and we'll be looking into what to do next," said Collier.

Boeing's advanced vehicle concept centers on the company's now-familiar blended wing body design, as seen in the sub-scale remotely piloted X-48, which has been wind tunnel tested at NASA's Langley Research Center and flown at NASA's Dryden Flight Research Center. One thing that makes this concept different from current airplanes is the placement of its Pratt & Whitney geared turbofan engines. The engines are on top of the plane's back end, flanked by two vertical tails to shield people on the ground from engine noise. The aircraft also would feature an advanced lightweight, damage-tolerant composite structure; technologies for reducing airframe noise; advanced flight controls; hybrid laminar flow control, which means surfaces designed to reduce drag; and long-span wings, which improve fuel efficiency.

Lockheed Martin took an entirely different approach. Its engineers proposed a box wing design, in which a front wing mounted on the lower belly of the plane is joined at the tips to an aft wing mounted on top of the plane. The company has studied the box wing concept for three decades, but has been waiting for lightweight composite materials, landing gear technologies, hybrid laminar flow and other tools to make it a viable configuration. Lockheed's proposal combines the unique design with a Rolls-Royce Liberty Works Ultra Fan Engine. This engine has a bypass ratio approximately five times greater than current engines, pushing the limits of turbofan technology.

Northrop Grumman chose to embrace a little of its company's history, going back to the 1930s and '40s, with its advanced vehicle concept. Its design is a flying wing, championed by Northrop founder Jack Northrop and reminiscent of the company's B-2 aircraft. Four high-bypass engines, provided by Rolls-Royce and embedded in the upper surface of the aerodynamically efficient wing, would provide noise shielding. The company's expertise in building planes without the benefit of a stabilizing tail would be transferred to the commercial airline market. The Northrop proposal also incorporates advanced composite materials and engine and swept-wing laminar flow control technologies.

What the studies revealed is that NASA's goals to reduce fuel consumption, emissions and noise are indeed challenging. The preliminary designs all met the pollution goal of reducing landing and takeoff emissions of nitrogen oxides by 50 percent. All still have a little way to go to meet the other two challenges. All the designs came very close to a 50-percent fuel burn reduction, but noise reduction capabilities varied.

"All of the teams have done really great work during this conceptual design study," say Mark Mangelsdorf, ERA Project chief engineer. "Their results make me excited about how interesting and different the airplanes on the airport ramp could look in 20 years. Another great result of the study is that they have really helped us focus where to invest our research dollars over the next few years," he said.

NASA's ERA project officials say they believe all the goals can be met if small gains in noise and fuel consumption reduction can be achieved in addition to those projected in the industry studies. The results shed light on the technology and design hurdles airline manufacturers face in trying to design lean, green flying machines and will help guide NASA's environmentally responsible aviation investment strategy for the second half of its six-year project.

Sunday 19 February 2012

Reversing the problem clarifies molecular structure

Optical techniques enable us to examine single molecules, but do we really understand what we are seeing? After all, the fuzziness caused by effects such as light interference makes these images very difficult to interpret. Researchers at the University of Twente's MESA+ Institute for Nanotechnology adopted a "reverse" approach to spectroscopy which cleaned up images by eliminating background noise.

The researchers presented their findings in Physical Review Letters.

Rather than starting with the laser beam, the trick is to take the molecule you are studying as the starting point. This radical "reversal" led to a relatively simple modification of conventional CARS spectroscopy, which delivered better images. CARS is already a powerful technique that uses lasers to visualize molecules for such purposes as food testing and medical imaging. One advantage is that no fluorescent labels are needed to make the molecules visible. However, background noise complicates the task of interpreting the resultant images. The new approach eliminates such noise completely, leaving only the "real" image. More information than ever before, such as accurate details of a substance's concentration, can be obtained using this technique, and it is easier to detect the signature of the molecule in question.

Energy

The key to side-stepping the overwhelming complexity involved lay in Prof. Shaul Mukamel's exhortation to just "Look at the molecule!" (the professor, who holds a post at the University of California, collaborated on the present publication). So don't focus on the way that light interacts with the molecule, as this makes it very difficult -- even impossible -- to "separate the wheat from the chaff" and reveal the real image. Instead, start by examining the energy levels inside the molecule. Previous work, based on Prof. Mukamel's exhortation, has mainly led to the development of new theories. The University of Twente researchers have now translated this theory into the new technique of Vibrational Molecular Interferometry, which will vastly expand the uses of CARS and other techniques.

This study was conducted in Prof. Jennifer Herek's Optical Sciences group, which is part of the MESA+ Institute for Nanotechnology of the University of Twente. The study was funded in part by the Foundation for Fundamental Research on Matter (FOM) and in part by the VICI grant previously awarded to Jennifer Herek by the Netherlands Organisation for Scientific Research (NWO).

The publication, entitled "Background-free nonlinear microspectroscopy with vibrational molecular interferometry," by Erik Garbacik, Jeroen Korterik, Cees Otto, Shaul Mukamel, Jennifer Herek and Herman Offerhaus, was published on 16 December in the online edition of Physical Review Letters.

Friday 17 February 2012

Kepler team finds 11 new solar systems

NASA's planet-hunting Kepler space telescope has found 11 new planetary systems, including one with five planets all orbiting closer to their parent star than Mercury circles the Sun.

The discoveries boost the list of confirmed extra-solar planets to 729, including 60 credited to the Kepler team.

The telescope, launched in March 2009, can detect slight but regular dips in the amount of light coming from stars. Scientists can then determine whether the changes are caused by orbiting planets passing in front of their host stars from Kepler's point of view.
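The size of those dips follows from simple geometry: a transiting planet blocks a fraction of the stellar disk roughly equal to the square of the radius ratio. A short sketch, using standard round values for the Sun, Earth and Jupiter rather than numbers from the Kepler papers:

```python
# Transit photometry in one line of geometry: the fractional dip in
# starlight is about (R_planet / R_star)^2 for a central transit.

R_SUN_KM = 696_000
R_EARTH_KM = 6_371
R_JUPITER_KM = 69_911

def transit_depth(r_planet_km, r_star_km=R_SUN_KM):
    """Fractional drop in a star's brightness while the planet transits."""
    return (r_planet_km / r_star_km) ** 2

print(f"Earth-size planet:   {transit_depth(R_EARTH_KM):.6f}")    # ~0.00008
print(f"Jupiter-size planet: {transit_depth(R_JUPITER_KM):.4f}")  # ~0.0101
```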

Kepler scientists have another 2300 candidate planets awaiting additional confirmation.

None of the newly discovered planetary systems are like our solar system, though Kepler-33, a star that is older and bigger than the Sun, comes close in terms of sheer numbers. It has five planets, compared to our solar system's eight, but the quintet all fly closer to their parent star than Mercury orbits the Sun.

The planets range in size from about 1.5 times the diameter of Earth to five times Earth's diameter. Scientists have not yet determined if any are solid rocky bodies like Earth, Venus, Mars and Mercury or if they are filled with gas like Jupiter, Saturn, Uranus and Neptune.

The Kepler team previously found one star with six confirmed planets and a second system with five planets, says planetary scientist Jack Lissauer, with NASA's Ames Research Center in California.

Nine of the new systems contain two planets and one has three, bringing the total number of newly discovered planets to 26. All are closer to their host stars than Venus is to the Sun.

"This has tripled the number of stars which we know have more than one transiting planet, so that's the big deal here," says Lissauer.

"We're starting to think in terms of planetary systems as opposed to just planets: Do they all tend to have similar sizes? What's the spacing? Is the solar system unusual in those regards?" he says.

Kepler is monitoring more than 150,000 stars in the constellations Cygnus and Lyra.

The research is published in four different papers in the Astrophysical Journal and the Monthly Notices of the Royal Astronomical Society.

Wednesday 15 February 2012

Graphene: Supermaterial goes superpermeable

Wonder material graphene has revealed another of its extraordinary properties -- University of Manchester researchers have found that it is superpermeable with respect to water.

Graphene is one of the wonders of the science world, with the potential to create foldaway mobile phones, wallpaper-thin lighting panels and the next generation of aircraft. The new finding at the University of Manchester gives graphene's potential a most surprising dimension -- graphene can also be used for distilling alcohol.

In a report published in Science, a team led by Professor Sir Andre Geim shows that graphene-based membranes are impermeable to all gases and liquids (vacuum-tight). However, water evaporates through them as quickly as if the membranes were not there at all.

This newly-found property can now be added to the already long list of superlatives describing graphene. It is the thinnest known material in the universe and the strongest ever measured. It conducts electricity and heat better than any other material. It is the stiffest one too and, at the same time, it is the most ductile. Demonstrating its remarkable properties won University of Manchester academics the Nobel Prize in Physics in 2010.

Now the University of Manchester scientists have studied membranes made from a chemical derivative of graphene called graphene oxide. Graphene oxide is the same graphene sheet but randomly covered with other molecules such as hydroxyl (OH) groups. Graphene oxide sheets stack on top of each other and form a laminate.

The researchers prepared such laminates that were hundreds of times thinner than a human hair yet remained strong, flexible and easy to handle.

When a metal container was sealed with such a film, even the most sensitive equipment was unable to detect any air or other gas, including helium, leaking through.

It came as a complete surprise that, when the researchers tried the same with ordinary water, they found that it evaporates as if the graphene seal were not there. Water molecules diffused through the graphene-oxide membranes with such great speed that the evaporation rate was the same whether the container was sealed or completely open.

Dr Rahul Nair, who was leading the experimental work, offers the following explanation: "Graphene oxide sheets arrange in such a way that between them there is room for exactly one layer of water molecules. They arrange themselves in one-molecule-thick sheets of ice which slide along the graphene surface with practically no friction.

"If another atom or molecule tries the same trick, it finds that graphene capillaries either shrink in low humidity or get clogged with water molecules."

"Helium gas is hard to stop. It slowly leaks even through a millimetre -thick window glass but our ultra-thin films completely block it. At the same time, water evaporates through them unimpeded. Materials cannot behave any stranger," comments Professor Geim. "You cannot help wondering what else graphene has in store for us."

"This unique property can be used in situations where one needs to remove water from a mixture or a container, while keeping in all the other ingredients," says Dr Irina Grigorieva who also participated in the research.

"Just for a laugh, we sealed a bottle of vodka with our membranes and found that the distilled solution became stronger and stronger with time. Neither of us drinks vodka but it was great fun to do the experiment," adds Dr Nair.

The Manchester researchers report this experiment in their Science paper, too, but they say they do not envisage use of graphene in distilleries, nor offer any immediate ideas for applications.

However, Professor Geim adds: "The properties are so unusual that it is hard to imagine that they cannot find some use in the design of filtration, separation or barrier membranes and for selective removal of water."

Monday 13 February 2012

Electron freedom could spark new computing

A new kind of quantum computing could now be possible given the latest discovery about the way electrons interact.

Physicist Dr Giuseppe Tettamanzi, from the University of New South Wales, and colleagues, report their findings this week in the journal Physical Review Letters.

"Our work hints that the orbital degree of freedom can be used for quantum logic, as in quantum computation, and there may be other applications," says Tettamanzi, whose paper is available on the pre-press website arXiv.org.

Classical computers use millions of transistors on silicon chips that flip on and off, creating a series of 1's and 0's, according to whether current flows through them or not.

The aim of quantum computing is to use the individual properties of electrons to increase the power of data processing.

To date, quantum computing has sought to exploit the charge of electrons, or their spin (which can be either up or down), to create 1's and 0's.

But Tettamanzi and team have now shown that another property of electrons could instead be used to create 1's and 0's.

This property is the electron's so-called 'orbital degree of freedom', which describes its location around the nucleus.

The researchers have demonstrated that interaction between electrons with different orbital degrees of freedom can open a new transport channel in a silicon-based transistor.

While this so-called "Kondo effect" in itself is not obviously useful, the fact that it can be caused by orbital degree of freedom is exciting.

Previously the effect was only known to be caused by electron spin, which suggests orbital degree of freedom could be used, like spin, to develop quantum computers.

Tettamanzi and colleagues also applied a powerful magnetic field (10,000 times higher than Earth's magnetic field) and observed a mathematical symmetry in the system, which is something normally only seen in very expensive experiments, such as those carried out at CERN.

Most importantly, the researchers used a commonly available silicon-based transistor, which suggests that any quantum computer using orbital degrees of freedom could be commercially viable.

"It's an everyday system, not a very complex system, something that we get from a factory that's made silicon devices for industrial uses," says Tettamanzi.

"There are already billions of dollars of investment in silicon facilities, so if you do something with silicon it will be much easier for it to become an everyday device, because there is already all the infrastructure available."

Saturday 11 February 2012

Flaky graphene makes reliable chemical sensors

Scientists from the University of Illinois at Urbana-Champaign and the company Dioxide Materials have demonstrated that randomly stacked graphene flakes can make an effective chemical sensor.

The researchers created the one-atom-thick carbon lattice flakes by placing bulk graphite in a solution and bombarding it with ultrasonic waves that broke off thin sheets. The researchers then filtered the solution to produce a graphene film, composed of a haphazard arrangement of stacked flakes, that they used as the top layer of a chemical sensor. When the graphene was exposed to test chemicals that altered the surface chemistry of the film, the subsequent movement of electrons through the film produced an electrical signal that flagged the presence of the chemical.
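The detection logic of such a chemiresistive sensor can be pictured in a few lines. This is a generic illustration, not the authors' analysis code; the resistance values and the 2 percent threshold are hypothetical.

```python
# Generic chemiresistive detection: watch the film's resistance and flag
# any fractional shift from the clean-air baseline that rises above a
# noise threshold. (Illustrative numbers throughout.)

def detects_chemical(baseline_ohms, readings_ohms, threshold=0.02):
    """True if any reading departs from baseline by more than `threshold`."""
    return any(abs(r - baseline_ohms) / baseline_ohms > threshold
               for r in readings_ohms)

print(detects_chemical(1000.0, [1001.0, 1004.0, 1035.0]))  # True: 3.5% shift
print(detects_chemical(1000.0, [1001.0, 999.4, 1002.3]))   # False: noise only
```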

The researchers experimented by adjusting the volume of the filtered solution to make thicker or thinner films. They found that thin films of randomly stacked graphene could more reliably detect trace amounts of test chemicals than previously designed sensors made from carbon nanotubes or graphene crystals.

The results have been accepted for publication in the AIP journal Applied Physics Letters.

The researchers theorize that the improved sensitivity is due to the fact that defects in the carbon-lattice structure near the edge of the graphene flakes allow electrons to easily "hop" through the film.

Thursday 9 February 2012

NASA sees repeating La Niña hitting its peak

La Niña, "the diva of drought," is peaking, increasing the odds that the Pacific Northwest will have more stormy weather this winter and spring, while the southwestern and southern United States will be dry.

Sea surface height data from NASA's Jason-1 and -2 satellites show that the milder repeat of last year's strong La Niña has recently intensified, as seen in the latest Jason-2 image of the Pacific Ocean.

The image is based on the average of 10 days of data centered on Jan. 8, 2012. It depicts places where the Pacific sea surface height is higher than normal (due to warm water) as yellow and red, while places where the sea surface is lower than normal (due to cool water) are shown in blues and purples. Green indicates near-normal conditions. The height of the sea surface over a given area is an indicator of ocean temperature and other factors that influence climate.

This is the second consecutive year that the Jason altimetric satellites have measured lower-than-normal sea surface heights in the equatorial Pacific and unusually high sea surface heights in the western Pacific.

"Conditions are ripe for a stormy, wet winter in the Pacific Northwest and a dry, relatively rainless winter in Southern California, the Southwest and the southern tier of the United States," says climatologist Bill Patzert of JPL. "After more than a decade of mostly dry years on the Colorado River watershed and in the American Southwest, and only two normal rain years in the past six years in Southern California, low water supplies are lurking. This La Niña could deepen the drought in the already parched Southwest and could also worsen conditions that have fueled recent deadly wildfires."

NASA will continue to monitor this latest La Niña to see whether it has reached its expected winter peak or continues to strengthen.

A repeat of La Niña ocean conditions from one year to the next is not uncommon: repeating La Niñas occurred most recently in 1973-74-75, 1998-99-2000 and in 2007-08-09. Repeating La Niñas most often follow an El Niño episode and are essentially the opposite of El Niño conditions. During a La Niña episode, trade winds are stronger than normal, and the cold water that normally exists along the coast of South America extends to the central equatorial Pacific.

La Niña episodes change global weather patterns and are associated with less moisture in the air over cooler ocean waters. This results in less rain along the coasts of North and South America and along the equator, and more rain in the far Western Pacific.

The comings and goings of El Niño and La Niña are part of a long-term, evolving state of global climate, for which measurements of sea surface height are a key indicator. Jason-1 is a joint effort between NASA and the French Space Agency, Centre National d'Études Spatiales (CNES). Jason-2 is a joint effort between NASA, the National Oceanic and Atmospheric Administration, CNES and the European Organisation for the Exploitation of Meteorological Satellites (EUMETSAT). JPL manages the U.S. portion of both missions for NASA's Science Mission Directorate, Washington, D.C.

For more on how La Niña and other climate phenomena are affecting weather in the United States this year, see: http://science.nasa.gov/science-news/science-at-nasa/2012/17jan_missingsnow/ .

For more information on NASA's ocean surface topography missions, visit: http://sealevel.jpl.nasa.gov/

Tuesday 7 February 2012

Flexible adult stem cells, right there in your eye

In the future, patients in need of perfectly matched neural stem cells may not need to look any further than their own eyes. Researchers reporting in the January issue of Cell Stem Cell, a Cell Press publication, have identified adult stem cells of the central nervous system in a single layer of cells at the back of the eye.

That cell layer, known as the retinal pigment epithelium (RPE), underlies and supports photoreceptors in the light-sensitive retina. Without it, photoreceptors and vision are lost. The new study shows that the RPE also harbors self-renewing stem cells that can wake up to produce actively growing cultures when placed under the right conditions. They can also be coaxed into forming other cell types.

"You can get these cells from a 99-year-old," said Sally Temple of the Neural Stem Cell Institute in Rensselaer, New York. "These cells are laid down in the embryo and can remain dormant for 100 years. Yet you can pull them out and put them in culture and they begin dividing. It is kind of mind boggling."

Temple's group got the RPE-derived stem cells they describe from the eyes of donors in the hours immediately after their deaths. But the cells can also be isolated from the fluid that surrounds the retina at the back of the eye, which means they are accessible in living people as well.

"You can literally go in and poke a needle in the eye and get these cells from the subretinal space," she says. "It sounds awful, but retinal surgeons do it every day." By comparison, access to most other neural stem cell populations would require major surgery.

Temple said they were curious about the proliferative potential of the RPE given that the tissue is known to be capable of regenerating entire retinas in salamanders. But that plasticity in adulthood had seemed to be lost in mice and chicks. Still, "given the evolutionary evidence, we thought it was worth revisiting," she said.

They placed RPE tissue taken from 22-year-old to 99-year-old cadavers into many culture conditions to see what they could make the cells do. They found one set of conditions that got the cells dividing. Not all of the RPE cells have this regenerative potential, but perhaps 10 percent of them do.

Further work showed that the cells are multipotent, which means that they can form different cell types, though the researchers admit there is more to do to fully explore the cells' differentiation capacity.

There are other implications as well. For example, these cells may explain diseases in which other tissue types show up in the eye. Their presence also suggests that there might be some way to stimulate controlled repair of the eye in the millions of people who suffer from age-related macular degeneration.

"I think it might be possible," Temple said.

Sunday 5 February 2012

Most distant dwarf galaxy detected

Scientists have long struggled to detect the dim dwarf galaxies that orbit our own galaxy. So it came as a surprise on Jan. 18 when a team of astronomers using the Keck II telescope's adaptive optics announced the discovery of a dwarf galaxy halfway across the universe.

The new dwarf galaxy found by MIT's Dr. Simona Vegetti and colleagues is a satellite of an elliptical galaxy almost 10 billion light-years away from Earth. The team detected it by studying how the massive elliptical galaxy, called JVAS B1938+666, serves as a gravitational lens for light from an even more distant galaxy directly behind it. Their discovery was published in the Jan. 18 online edition of the journal Nature.

Like all supermassive elliptical galaxies, JVAS B1938+666 can deflect light passing by it with its gravity. Often the light from a background galaxy gets deformed into an arc around the lens galaxy, and sometimes into what's called an Einstein ring. In this case, the ring is formed mainly by two lensed images of the background galaxy. The size, shape and brightness of the Einstein ring depend on the distribution of mass throughout the foreground lensing galaxy.
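For the simplest possible lens, a point mass, the angular size of that ring is set by the Einstein-radius formula. The sketch below uses a hypothetical lens mass and distances, and naively approximates the lens-source separation rather than using proper cosmological angular-diameter distances.

```python
# Point-mass Einstein radius: theta_E = sqrt(4*G*M/c^2 * D_ls / (D_l * D_s)).
# Illustrative inputs only; not the measured values for JVAS B1938+666.

import math

G = 6.674e-11       # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8         # speed of light, m/s
M_SUN = 1.989e30    # solar mass, kg
MPC = 3.086e22      # metres per megaparsec

def einstein_radius_arcsec(mass_kg, d_l_m, d_s_m):
    """Einstein radius in arcseconds for a point-mass lens."""
    d_ls = d_s_m - d_l_m  # naive; real work uses angular-diameter distances
    theta_rad = math.sqrt(4 * G * mass_kg / C**2 * d_ls / (d_l_m * d_s_m))
    return math.degrees(theta_rad) * 3600

# A 10^11 solar-mass lens at 1000 Mpc with a source at 2000 Mpc:
print(f"{einstein_radius_arcsec(1e11 * M_SUN, 1000 * MPC, 2000 * MPC):.2f}"
      " arcsec")  # ~0.64 arcsec
```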

Vegetti and her team obtained an extra-sharp near-infrared image of JVAS B1938+666 using the 10-meter Keck II telescope and its adaptive optics system, which corrects for the blurring effects of Earth's atmosphere and provides stunningly sharp images. With these data, they neatly determined the mass distribution of JVAS B1938+666 as well as the shape and brightness of the background galaxy.

The researchers used a sophisticated numerical technique to derive a model of the lens galaxy's mass, as well as to map any excess lens mass that could not be accounted for by the galaxy. What they found was an excess mass near the Einstein ring that they attributed to the presence of a satellite, or "dwarf," galaxy. Vegetti's team also used a separate analytical model to test the detected excess mass. They found that a satellite galaxy is indeed required to explain the data.

"This satellite galaxy is exciting because it was detected in the excess-mass map despite its low mass," commented Robert Schmidt of the Center for Astronomy at Heidelberg University, in a related Nature article. "A natural question to ask is whether the satellite galaxy can be observed directly rather than by its gravitational effect on the shape of a background object. With current instrumentation, the answer is no. The object is simply too distant to be imaged directly. But the message here is that it is possible to spot these elusive objects around distant lens galaxies without knowing where to look for them."

Galaxies like our own are believed to form over billions of years through the merging of many smaller galaxies. So it's expected that there should be many smaller dwarf galaxies buzzing around the Milky Way. However, very few of these tiny relic galaxies have been observed, which has led astronomers to conclude that many of them must have very few stars or possibly may be made almost exclusively of dark matter.

Scientists theorize the existence of dark matter to explain observations that suggest there is far more mass in the universe than can be seen. However, because the particles that make up dark matter do not absorb or emit light, they have so far proven impossible to detect and identify. Computer modeling suggests that the Milky Way should have about 10,000 satellite dwarf galaxies, but only 30 have been observed.

"It could be that many of the satellite galaxies are made of dark matter, making them elusive to detect, or there may be a problem with the way we think galaxies form," says Vegetti.

In the new study, Vegetti worked with Prof. Leon Koopmans of the University of Groningen, Netherlands; Dr. David Lagattuta and Prof. Christopher Fassnacht of the University of California at Davis; Dr. Matthew Auger of the University of California at Santa Barbara; and Dr. John McKean of the Netherlands Institute for Radio Astronomy.

"The existence of this low-mass dark galaxy is just within the bounds we expect if the Universe is composed of dark matter which has a low temperature. However, further dark satellites will need to be found to confirm this conclusion," says Vegetti.

Friday 3 February 2012

Physicists develop first conclusive test to better understand high-energy particle correlations

Researchers have devised a proposal for the first conclusive experimental test of a phenomenon known as 'Bell's nonlocality.' The test is designed to reveal correlations stronger than any classical correlations, and to do so between high-energy particles that are not made of ordinary matter or light. These results are relevant to the so-called 'CP violation' principle, which is used to explain the dominance of matter over antimatter.

These findings by Beatrix Hiesmayr, a theoretical physicist at the University of Vienna, and her colleagues, a team of quantum information theory specialists, particle physicists and nuclear physicists, have been published in The European Physical Journal C.

According to the famous Einstein-Podolsky-Rosen Gedanken-Experiment, two particles that are measured independently obey the principle of locality, implying that an external influence on the first particle, such as a measurement, has no direct influence on the second -- in other words, there is no "spooky action at a distance," as Einstein would likely have described it.

In experimental setups, however, measurement results for one particle turn out to be correlated with those for the other particle. Initially, these correlations could only be explained by invoking hidden parameters. In 1964, John Bell showed that so-called local realistic hidden-parameter theories impose limits on these correlations, limits that can be probed experimentally through what are now known as Bell tests. Since then, many experiments have demonstrated that local realistic hidden parameters cannot explain the observed correlations.
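The best-known form of such a test is the CHSH inequality for spin or photon pairs; the kaon test devised here is more elaborate, but the underlying logic can be sketched in a few lines. The measurement angles below are the standard textbook choice, not parameters from the paper.

```python
# CHSH sketch: local realistic hidden-parameter theories bound |S| <= 2,
# while quantum mechanics predicts up to 2*sqrt(2) for entangled pairs.

import math

def E(a, b):
    """Quantum correlation of a spin singlet measured at angles a and b."""
    return -math.cos(a - b)

a0, a1 = 0.0, math.pi / 2              # Alice's two measurement settings
b0, b1 = math.pi / 4, 3 * math.pi / 4  # Bob's two measurement settings

S = E(a0, b0) - E(a0, b1) + E(a1, b0) + E(a1, b1)
print(abs(S))  # 2.828... = 2*sqrt(2) > 2: rules out local hidden parameters
```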

In this study, the authors have succeeded in devising a new Bell test, taking into account the decay property of high-energy particles systems, called kaon-antikaon systems. This procedure ensures that the test is conclusive -- a goal that has never before been achieved -- and simultaneously guarantees its experimental testability. Experimental testing requires equipment such as the KLOE detector at the accelerator facility DAPHNE in Italy.

Revealing "spooky action at distance" for kaon-antikaon pairs has fundamental implications for our understanding of such particles' correlations and could ultimately allow us to determine whether symmetries in particle physics and manifestations of particles correlations are linked.

Thursday 2 February 2012

What's taking ET so long to find us

Mathematically speaking, ET should have found us by now -- if he exists -- so we're being consciously avoided for some reason, a new study concludes.

"We're either alone, or they're out there and leave us alone," says Florida Gulf Coast University mathematician Thomas Hair.

Hair, who presented his research at a Mathematical Association of America meeting in Boston earlier this month, based his calculations on what he considered to be extremely conservative estimates of how long it would take a society to muster up the resources and technological know-how to leave its home world and travel to another star.

Even at the relatively sedate pace of 1 per cent of light-speed, the aliens would arrive at their nearest neighbour star in about 500 years.

Figure another 500 years to build new ships, set out again, and so on and so on, and the calculations show that civilisations starting out from the oldest stars in our galaxy would have had epochs of time to reach us by now. So where are they?
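The arithmetic behind that conclusion is short enough to sketch. The hop distance, rebuild time and galaxy size below are illustrative round numbers in the spirit of Hair's estimates, not his exact model.

```python
# Sketch of the colonisation-wavefront estimate: hop to a neighbouring
# star at 1 percent of light speed, pause to rebuild, repeat across the
# galaxy. (Illustrative round numbers.)

SPEED_C = 0.01            # cruise speed as a fraction of light speed
HOP_LY = 5                # light-years to a typical neighbouring star
REBUILD_YEARS = 500       # years to resupply before the next departure
GALAXY_SPAN_LY = 100_000  # rough diameter of the Milky Way disk

years_per_hop = HOP_LY / SPEED_C + REBUILD_YEARS  # 500 + 500 = 1000 years
hops = GALAXY_SPAN_LY / HOP_LY                    # 20,000 hops
print(f"{hops * years_per_hop:,.0f} years")       # 20,000,000 years --
# a blink beside the ~10-billion-year age of the galaxy's oldest stars
```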

"They've either passed us by, or they stay around their home star systems and contemplate their navels," says Hair.

There could be several reasons why we're not listed in the intergalactic version of Trip Advisor or Lonely Planet. Perhaps most important is that we don't have anything aliens need.

"Any ancient civilisation is probably not biological. They don't need a place like Earth. They don't need to come here and steal our water. There's plenty of it out in the outer solar system where the gravity is not so great and they can just take all they want," says Hair.

Or perhaps modern-day extraterrestrials are following routes laid out long ago, all of which bypass Earth, he adds.

Giving ourselves away

Whatever the reason we're being ignored, there is no chance ET, if he or she exists, does not know we are here, says Hair, pointing to telescopes, such as NASA's Kepler observatory, which can detect planets around other stars.

If humans living on a planet that is roughly 5 billion years old have technology like Kepler, an alien civilisation with another 10 million years of experience under its belt would have advanced much further, Hair maintains.

"I'm sure they'd be able to detect if this planet had life on it. Just the CFCs (chlorofluorocarbons) in our atmosphere would give us away," he says.

CFCs are compounds typically found in refrigerants and aerosol products that release chlorine atoms when exposed to ultraviolet light and erode Earth's ozone layer.

Rare, weird and boring

University of Minnesota physicist Woods Halley, who just published a book about the prospects of extraterrestrial life, says we don't know enough about how life got started on Earth to be able to recognise alien life, even if it were staring us in the face.

"I think there are three options," says Halley. "Life is rare, which I think has a reasonable probability of being correct. Life is weird: every time you run into it, it's extremely different from the last time you saw it. Life is dull, meaning you will find something that looks a lot like life on Earth and our problems (in detecting life) are technical.

"I've come to the view that they're all possible, but the preponderance of evidence most likely fits the first - we are rare," says Halley.

Wednesday 1 February 2012

World's smallest magnetic data storage unit

Scientists from IBM and the German Center for Free-Electron Laser Science (CFEL) have built the world's smallest magnetic data storage unit. It uses just twelve atoms per bit, the basic unit of information, and squeezes a whole byte (8 bits) into as few as 96 atoms. A modern hard drive, for comparison, still needs more than half a billion atoms per byte.

The team presented their work in the journal Science on January 13, 2012. CFEL is a joint venture of the research centre Deutsches Elektronen-Synchrotron DESY in Hamburg, the Max-Planck-Society (MPG) and the University of Hamburg. "With CFEL the partners have established an innovative institution on the DESY campus, delivering top-level research across a broad spectrum of disciplines," says DESY research director Edgar Weckert.

The nanometre-scale data storage unit was built atom by atom with the help of a scanning tunneling microscope (STM) at IBM's Almaden Research Center in San Jose, California. The researchers constructed regular patterns of iron atoms, aligning them in rows of six atoms each. Two rows are sufficient to store one bit. A byte correspondingly consists of eight pairs of atom rows and occupies an area of only 4 by 16 nanometres (a nanometre being a millionth of a millimetre). "This corresponds to a storage density that is a hundred times higher compared to a modern hard drive," explains Sebastian Loth of CFEL, lead author of the Science paper.
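The article's own numbers make the comparison easy to check; the half-a-billion-atom figure for a conventional drive is the article's rough benchmark.

```python
# Checking the storage arithmetic quoted in the article.

ATOMS_PER_BIT = 12
BITS_PER_BYTE = 8
BYTE_AREA_NM2 = 4 * 16           # the byte occupies 4 x 16 nanometres

print(ATOMS_PER_BIT * BITS_PER_BYTE)  # 96 atoms per byte
print(BITS_PER_BYTE / BYTE_AREA_NM2)  # 0.125 bits per square nanometre

HDD_ATOMS_PER_BYTE = 500_000_000      # "more than half a billion"
print(HDD_ATOMS_PER_BYTE // 96)       # ~5.2 million times fewer atoms
```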

Data are written into and read out from the nano storage unit with the help of an STM. The pairs of atom rows have two possible magnetic states, representing the two values '0' and '1' of a classical bit. An electric pulse from the STM tip flips the magnetic configuration from one to the other. A weaker pulse allows the configuration to be read out, although the nano magnets are currently only stable at a frosty temperature of minus 268 degrees Celsius (5 kelvin). "Our work goes far beyond current data storage technology," says Loth. The researchers expect arrays of some 200 atoms to be stable at room temperature. Still, it will take some time before atomic magnets can be used in data storage.

For the first time, the researchers have managed to employ a special form of magnetism for data storage purposes, called antiferromagnetism. Unlike in ferromagnetism, which is used in conventional hard drives, the spins of neighbouring atoms within antiferromagnetic material are oppositely aligned, rendering the material magnetically neutral on a bulk level. This means that antiferromagnetic atom rows can be spaced much more closely without magnetically interfering with each other. Thus, the scientists managed to pack bits only one nanometre apart.

"Looking at the shrinking of electronics components we wanted to know if this can be driven into the realm of single atoms," explains Loth. But instead of shrinking existing components the team chose the opposite approach: "Starting with the smallest thing -- single atoms -- we built data storage devices one atom at a time," says IBM research staff member Andreas Heinrich. The required precision is only mastered by few research groups worldwide.

"We tested how large we have to build our unit to reach the realm of classical physics," explains Loth, who moved from IBM to CFEL four months ago. Twelve atoms emerged as the minimum with the elements used. "Beneath this threshold quantum effects blur the stored information." If these quantum effects can somehow be employed for an even denser data storage is currently a topic of intense research.

With their experiments the team have not only built the smallest magnetic data storage unit ever, but have also created an ideal testbed for the transition from classical to quantum physics. "We have learned to control quantum effects through form and size of the iron atom rows," explains Loth, leader of the Max Planck research group 'dynamics of nanoelectric systems' at CFEL in Hamburg and the Max-Planck-Institute for Solid State Research at Stuttgart, Germany. "We can now use this ability to investigate how quantum mechanics kicks in. What separates quantum magnets from classical magnets? How does a magnet behave at the frontier between both worlds? These are exciting questions that soon could be answered."

A new CFEL laboratory offering ideal conditions for this research will enable Loth to follow up these questions. "With Sebastian Loth, one of the world's leading scientists in the field of time-resolved scanning tunneling microscopy has joined CFEL," stresses CFEL research coordinator Ralf Köhn. "This perfectly complements our existing expertise for the investigation of the dynamics in atomic and molecular systems."