Wednesday, 29 February 2012

Superfluorescence seen from solid-state material

In a flash, the world changed for Tim Noe -- and for physicists who study what they call many-body problems. The Rice University graduate student was the first to see, in the summer of 2010, proof of a theory that solid-state materials are capable of producing an effect known as superfluorescence.

That can only happen when "many bodies" -- in this case, electron-hole pairs created in a semiconductor -- decide to cooperate.

Noe, a student of Rice physicist Junichiro Kono, and their research team used high-intensity laser pulses, a strong magnetic field and very cold temperatures to create the conditions for superfluorescence in a stack of 15 undoped quantum wells. The wells were made of indium, gallium and arsenic and separated by barriers of gallium arsenide (GaAs). The researchers' results were reported this week in the journal Nature Physics.

Noe spent weeks at the only facility with the right combination of gear to carry out such an experiment, the National High Magnetic Field Laboratory at Florida State University. There, he placed the device in an ultracold (as low as 5 kelvins) chamber, pumped up the magnetic field (which effectively makes the "many body" particles -- the electron-hole pairs -- more sensitive and controllable) and fired a strong laser pulse at the array.

"When you shine light on a semiconductor with a photon energy larger than the band gap, you can create electrons in the conduction band and holes in the valence band. They become conducting," said Kono, a Rice professor of electrical and computer engineering and in physics and astronomy. "The electrons and holes recombine -- which means they disappear -- and emit light. One electron-hole pair disappears and one photon comes out. This process is called photoluminescence."

The Rice experiment acted just that way, but pumping strong laser light into the layers created a cascade among the quantum wells. "What Tim discovered is that in these extreme conditions, with an intense pulse of light on the order of 100 femtoseconds (quadrillionths of a second), you create many, many electron-hole pairs. Then you wait for hundreds of picoseconds (mere trillionths of a second) and a very strong pulse comes out," Kono said.

In the quantum world, that's a long gap. Noe attributes that "interminable" wait of trillionths of a second to the process going on inside the quantum wells. There, the 8-nanometer-thick layers soaked up energy from the laser as it bored in and created what the researchers called a magneto-plasma, a state consisting of a large number of electron-hole pairs. These initially incoherent pairs suddenly line up with each other.

"We're pumping (light) to where absorption's only occurring in the GaAs layers," Noe said. "Then these electrons and holes fall into the well, and the light hits another GaAs layer and another well, and so on. The stack just increases the amount of light that's absorbed." The electrons and holes undergo many scattering processes that leave them in the wells with no coherence, he said. But as a result of the exchange of photons from spontaneous emission, a large, macroscopic coherence develops.

Like a capacitor in an electrical circuit, the wells become saturated and, as the researchers wrote, "decay abruptly," releasing the stored energy as a giant pulse of coherent radiation.

"What's unique about this is the delay time between when we create the population of electron-hole pairs and when the burst happens. Macroscopic coherence builds up spontaneously during this delay," Noe said.

Kono said the basic phenomenon of superfluorescence has been seen for years in molecular and atomic gases but wasn't sought in a solid-state material until recently. The researchers now feel such superfluorescence can be fine-tuned. "Eventually we want to observe the same phenomenon at room temperature, and at much lower magnetic fields, maybe even without a magnetic field," he said.

Even better, Kono said, it may be possible to create superfluorescent pulses with any desired wavelength in solid-state materials, powered by electrical rather than light energy.

The researchers said they expect the paper to draw serious interest from their peers in a variety of disciplines, including condensed matter physics; quantum optics; atomic, molecular and optical physics; semiconductor optoelectronics; quantum information science; and materials science and engineering.

There's much work to be done, Kono said. "There are several puzzles that we don't understand," he said. "One thing is a spectral shift over time: The wavelength of the burst is actually changing as a function of time when it comes out. It's very weird, and that has never been seen."

Noe also observed superfluorescent emission with several distinct peaks in the time domain, another mystery to be investigated.

Monday, 27 February 2012

Earth's energy budget remained out of balance despite unusually low solar activity

A new NASA study underscores the fact that greenhouse gases generated by human activity -- not changes in solar activity -- are the primary force driving global warming.

The study offers an updated calculation of Earth's energy imbalance, the difference between the amount of solar energy absorbed by Earth's surface and the amount returned to space as heat. The researchers' calculations show that, despite unusually low solar activity between 2005 and 2010, the planet continued to absorb more energy than it returned to space.

James Hansen, director of NASA's Goddard Institute for Space Studies (GISS) in New York City, led the research. Atmospheric Chemistry and Physics published the study last December.

Total solar irradiance, the amount of energy produced by the sun that reaches the top of each square meter of Earth's atmosphere, typically declines by about a tenth of a percent during cyclical lulls in solar activity caused by shifts in the sun's magnetic field. Usually solar minimums occur about every eleven years and last a year or so, but the most recent minimum persisted more than two years longer than normal, making it the longest minimum recorded during the satellite era.

Pinpointing the magnitude of Earth's energy imbalance is fundamental to climate science because it offers a direct measure of the state of the climate. Energy imbalance calculations also serve as the foundation for projections of future climate change. If the imbalance is positive and more energy enters the system than exits, Earth grows warmer. If the imbalance is negative, the planet grows cooler.

Hansen's team concluded that Earth has absorbed more than half a watt more solar energy per square meter than it released over the six-year study period. The calculated value of the imbalance (0.58 watts of excess energy per square meter) is more than twice as large as the reduction in the amount of solar energy supplied to the planet between maximum and minimum solar activity (0.25 watts per square meter).
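To see where a figure like 0.25 watts per square meter comes from, here is a rough sketch of the geometry, using commonly quoted round values for total solar irradiance and Earth's reflectivity rather than the study's own inputs:

```python
# Rough geometry behind the solar-minimum figure quoted above (round, assumed
# values; not the study's own calculation): a ~0.1 percent dip in total solar
# irradiance, spread over the whole sphere and corrected for reflected sunlight,
# amounts to only about a quarter of a watt per square meter.
TSI = 1361.0          # total solar irradiance, W/m^2 (approximate)
DIP_FRACTION = 0.001  # ~0.1 percent decline at solar minimum
ALBEDO = 0.30         # fraction of sunlight reflected straight back to space (approximate)

dip_top_of_atmosphere = TSI * DIP_FRACTION
# a sphere intercepts sunlight over 1/4 of its surface area, and ~30% is reflected
change_in_absorbed = dip_top_of_atmosphere / 4.0 * (1.0 - ALBEDO)
print(f"change in absorbed sunlight: {change_in_absorbed:.2f} W/m^2")   # ~0.24
print("measured imbalance:          0.58 W/m^2 (more than twice as large)")
```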

"The fact that we still see a positive imbalance despite the prolonged solar minimum isn't a surprise given what we've learned about the climate system, but it's worth noting because this provides unequivocal evidence that the sun is not the dominant driver of global warming," Hansen said.

According to calculations conducted by Hansen and his colleagues, the 0.58 watts per square meter imbalance implies that carbon dioxide levels need to be reduced to about 350 parts per million to restore the energy budget to equilibrium. The most recent measurements show that carbon dioxide levels are currently 392 parts per million and scientists expect that concentration to continue to rise in the future.
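As a rough consistency check, the widely used logarithmic approximation for carbon dioxide forcing (an assumed shortcut here; the team's own calculation is more detailed) gives nearly the same number:

```python
# Back-of-the-envelope check using the common approximation
# dF = 5.35 * ln(C / C0) W/m^2 for CO2 radiative forcing (an assumed shortcut,
# not the study's full calculation): dropping CO2 from 392 ppm back to about
# 350 ppm removes roughly the 0.58 W/m^2 needed to close the energy imbalance.
import math

def co2_forcing_w_m2(c_ppm: float, c0_ppm: float) -> float:
    """Approximate change in radiative forcing when CO2 goes from c0_ppm to c_ppm."""
    return 5.35 * math.log(c_ppm / c0_ppm)

print(f"forcing removed by 392 -> 350 ppm: {co2_forcing_w_m2(392, 350):.2f} W/m^2")  # ~0.61
```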

Climate scientists have been refining calculations of Earth's energy imbalance for many years, but this newest estimate is an improvement over previous attempts because the scientists had access to better measurements of ocean temperature than researchers have had in the past.

The improved measurements came from free-floating instruments that directly monitor the temperature, pressure and salinity of the upper ocean to a depth of 2,000 meters (6,560 feet). The network of instruments, known collectively as Argo, has grown dramatically in recent years since researchers first began deploying the floats a decade ago. Today, more than 3,400 Argo floats actively take measurements and provide data to the public, mostly within 24 hours.

Hansen's analysis of the information collected by Argo, along with other ground-based and satellite data, shows the upper ocean has absorbed 71 percent of the excess energy and the Southern Ocean, where there are few Argo floats, has absorbed 12 percent. The abyssal zone of the ocean, between about 3,000 and 6,000 meters (9,800 and 20,000 feet) below the surface, absorbed five percent, while ice absorbed eight percent and land four percent.
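Multiplying those fractions against the 0.58-watt imbalance gives a rough sense of the absolute heat uptake in each reservoir (a bookkeeping sketch, not figures reported in this form by the study):

```python
# Simple bookkeeping of where the excess energy goes, using the fractions quoted
# above and the 0.58 W/m^2 imbalance (the study reports heat uptake in more
# detail; this just multiplies the percentages out).
imbalance_w_m2 = 0.58
reservoirs = {
    "upper ocean": 0.71,
    "Southern Ocean": 0.12,
    "abyssal ocean": 0.05,
    "ice": 0.08,
    "land": 0.04,
}

for name, fraction in reservoirs.items():
    print(f"{name:>14}: {imbalance_w_m2 * fraction:.3f} W/m^2")
print(f"fractions sum to {sum(reservoirs.values()):.2f}")
```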

The updated energy imbalance calculation has important implications for climate modeling. Its value, which is slightly lower than previous estimates, suggests that most climate models overestimate how readily heat mixes deeply into the ocean and significantly underestimate the cooling effect of small airborne particles called aerosols, which along with greenhouse gases and solar irradiance are critical factors in energy imbalance calculations.

"Climate models simulate observed changes in global temperatures quite accurately, so if the models mix heat into the deep ocean too aggressively, it follows that they underestimate the magnitude of the aerosol cooling effect," Hansen said.

Aerosols, which can either warm or cool the atmosphere depending on their composition and how they interact with clouds, are thought to have a net cooling effect. But estimates of their overall impact on climate are quite uncertain given how difficult it is to measure the distribution of the particles on a broad scale. The new study suggests that the overall cooling effect from aerosols could be about twice as strong as current climate models suggest, largely because few models account for how the particles affect clouds.

"Unfortunately, aerosols remain poorly measured from space," said Michael Mishchenko, a scientist also based at GISS and the project scientist for Glory, a satellite mission designed to measure aerosols in unprecedented detail that was lost after a launch failure in early 2011. "We must have a much better understanding of the global distribution of detailed aerosol properties in order to perfect calculations of Earth's energy imbalance," said Mishchenko.

Saturday, 25 February 2012

Scientists see "sloshing" galaxy cluster

A Naval Research Laboratory scientist is part of a team that has recently discovered that vast clouds of hot gas are "sloshing" in Abell 2052, a galaxy cluster located about 480 million light years from Earth. The scientists are studying the hot (30 million degree) gas using X-ray data from NASA's Chandra X-ray Observatory and optical data from the Very Large Telescope to see the galaxies.

"The X-ray images were amazing. We were able to see gas sloshing like liquid in a glass" explains NRL's Dr. Tracy Clarke. "Of course this would be one enormous glass since we see the gas sloshing over a region of nearly a million light years across!"

The Chandra data reveal the huge spiral structure in the hot gas around the outside of the image. Zooming in on the cluster reveals "cavities" or "bubbles" surrounding the central giant elliptical galaxy. The spiral began when a small cluster of galaxies collided off-center with a larger one positioned around that central galaxy.

The gravitational attraction of the smaller cluster drew hot gas out of the main cluster toward it. Once the smaller cluster passed by the core, the gas movement reversed and the gas was pulled back toward the main cluster's center. The hot gas overshot the center, creating a "sloshing" effect much like what happens when a glass of liquid is quickly jerked sideways. In the cluster, gravity pulls back on the gas cloud, creating the spiral pattern.

For scientists, the observation of the "sloshing" motion in Abell 2052 is important for two reasons. First, the "sloshing" helps move some of the cooler, denser gas at the center of the cluster farther from the core. This cooler gas is only about 10 million degrees, compared to the average temperature of 30 million degrees. The movement reduces the amount of cooling in the cluster core and could limit the number of new stars formed in the central galaxy. The "sloshing" in Abell 2052 also helps redistribute heavy elements like iron and oxygen, which are created in supernova explosions. These heavy elements are an important part of the make-up of future stars and planets. The fact that Chandra's observation of Abell 2052 lasted more than a week was critical in providing scientists with this level of detail.

Besides the large-scale spiral feature, the Chandra observations also allowed scientists to see details in the center of the cluster related to outbursts from the supermassive black hole. The data reveal bubbles carved out by material blasted away from the black hole, surrounded by dense, bright, cool rims. Just as the "sloshing" helps reduce the cooling of the gas at the cluster's core, the bubble activity has the same effect, limiting the growth of the central galaxy and its supermassive black hole.

Thursday, 23 February 2012

Why pure quantum dots and nanorods shine brighter

To the lengthy list of serendipitous discoveries -- gravity, penicillin, the New World -- add this: Scientists with the U.S. Department of Energy (DOE)'s Lawrence Berkeley National Laboratory (Berkeley Lab) have discovered why a promising technique for making quantum dots and nanorods has so far been a disappointment. Better still, they've also discovered how to correct the problem.

A team of researchers led by chemist Paul Alivisatos, director of Berkeley Lab, and Prashant Jain, a chemist now with the University of Illinois, has discovered why nanocrystals made from multiple components in solution via the exchange of cations (positive ions) have been poor light emitters. The problem, they found, stems from impurities in the final product. The team also demonstrated that these impurities can be removed through heat.

"By heating these nanocrystals to 100 degrees Celsius, we were able to remove the impurities and increase their luminescence by 400-fold within 30 hours," says Jain, a member of Alivisatos' research group when this work was done. "When the impurities were removed the optoelectronic properties of nanocrystals made through cation-exchange were comparable in quality to dots and nanorods conventionally synthesized."

Says Alivisatos, "With our new findings, the cation-exchange technique really becomes a method that can be widely used to make novel high optoelectronic grade nanocrystals."

Jain is the lead author and Alivisatos the corresponding author of a paper describing this work in the journal Angewandte Chemie titled "Highly Luminescent Nanocrystals From Removal of Impurity Atoms Residual From Ion Exchange Synthesis." Other authors were Brandon Beberwyck, Lam-Kiu Fong and Mark Polking.

Quantum dots and nanorods are light-emitting semiconductor nanocrystals that have a broad range of applications, including bio-imaging, solar energy and display screen technologies. Typically, these nanocrystals are synthesized from colloids -- particles suspended in solution. As an alternative, Alivisatos and his research group developed a new solution-based synthesis technique in which nanocrystals are chemically transformed by exchanging or replacing all of the cations in the crystal lattice with another type of cation. This cation-exchange technique makes it possible to produce new types of core/shell nanocrystals that are inaccessible through conventional synthesis. Core/shell nanocrystals are heterostructures in which one type of semiconductor is enclosed within another, for example, a cadmium selenide (CdSe) core and a cadmium sulfide (CdS) shell.

"While holding promise for the simple and inexpensive fabrication of multicomponent nanocrystals, the cation-exchange technique has yielded quantum dots and nanorods that perform poorly in optical and electronic devices," says Alivisatos, a world authority on nanocrystal synthesis who holds a joint appointment with the University of California (UC) Berkeley, where he is the Larry and Diane Bock professor of Nanotechnology.

As Jain tells the story, he was in the process of disposing of CdSe/CdS nanocrystals in solution that were six months old when, out of habit, he tested the nanocrystals under ultraviolet light. To his surprise, he observed significant luminescence. Subsequent spectral measurements, compared against the earlier data, showed that the luminescence of the nanocrystals had increased by at least sevenfold.

"It was an accidental finding and very exciting," Jain says, "but since no one wants to wait six months for their samples to become high quality I decided to heat the nanocrystals to speed up whatever process was causing their luminescence to increase."

Jain and the team suspected, and subsequent study confirmed, that impurities -- original cations left behind in the crystal lattice during the exchange process -- were the culprit.

"Even a few cation impurities in a nanocrystal are enough to be effective at trapping useful, energetic charge-carriers," Jain says. "In most quantum dots or nanorods, charge-carriers are delocalized over the entire nanocrystal, making it easy for them to find impurities, no matter how few there might be, within the nanocrystal. By heating the solution to remove these impurities and shut off this impurity-mediated trapping, we give the charge-carriers enough time to radiatively combine and thereby boost luminescence."

Since charge-carriers are also instrumental in electronic transport, photovoltaic performance, and photocatalytic processes, Jain says that shutting off impurity-mediated trapping should also boost these optoelectronic properties in nanocrystals synthesized via the cation-exchange technique.

Tuesday, 21 February 2012

New ideas sharpen focus for greener aircraft

Leaner, greener flying machines for the year 2025 are on the drawing boards of three industry teams under contract to the NASA Aeronautics Research Mission Directorate's Environmentally Responsible Aviation Project.

Teams from The Boeing Company in Huntington Beach, Calif., Lockheed Martin in Palmdale, Calif., and Northrop Grumman in El Segundo, Calif., have spent the last year studying how to meet NASA's goals: develop technology that would allow future aircraft to burn 50 percent less fuel than aircraft that entered service in 1998 (the baseline for the study), produce 75 percent fewer harmful emissions, and shrink the size of geographic areas affected by objectionable airport noise by 83 percent.

"The real challenge is we want to accomplish all these things simultaneously," said ERA project manager Fay Collier. "It's never been done before. We looked at some very difficult metrics and tried to push all those metrics down at the same time."

So NASA put that challenge to industry -- awarding a little less than $11 million to the three teams to assess what kinds of aircraft designs and technologies could help meet the goals. The companies have just given NASA their results.

"We'll be digesting the three studies and we'll be looking into what to do next," said Collier.

Boeing's advanced vehicle concept centers on the company's now familiar blended wing body design, as seen in the sub-scale remotely piloted X-48, which has been wind tunnel tested at NASA's Langley Research Center and flown at NASA's Dryden Flight Research Center. One thing that makes this concept different from current airplanes is the placement of its Pratt & Whitney geared turbofan engines. The engines sit on top of the plane's back end, flanked by two vertical tails that shield people on the ground from engine noise. The aircraft also would feature an advanced lightweight, damage-tolerant composite structure; technologies for reducing airframe noise; advanced flight controls; hybrid laminar flow control, which means surfaces designed to reduce drag; and long-span wings, which improve fuel efficiency.

Lockheed Martin took an entirely different approach. Its engineers proposed a box wing design, in which a front wing mounted on the lower belly of the plane is joined at the tips to an aft wing mounted on top of the plane. The company has studied the box wing concept for three decades, but has been waiting for lightweight composite materials, landing gear technologies, hybrid laminar flow and other tools to make it a viable configuration. Lockheed's proposal combines the unique design with a Rolls-Royce Liberty Works Ultra Fan engine. This engine has a bypass ratio roughly five times greater than that of current engines, pushing the limits of turbofan technology.

Northrop Grumman chose to embrace a little of its own history, going back to the 1930s and '40s, with its advanced vehicle concept. Its design is a flying wing, championed by Northrop founder Jack Northrop and reminiscent of the company's B-2 aircraft. Four high-bypass engines, provided by Rolls-Royce and embedded in the upper surface of the aerodynamically efficient wing, would provide noise shielding. The company's expertise in building planes without the benefit of a stabilizing tail would be transferred to the commercial airline market. The Northrop proposal also incorporates advanced composite materials and engine and swept-wing laminar flow control technologies.

What the studies revealed is that NASA's goals to reduce fuel consumption, emissions and noise are indeed challenging. The preliminary designs all met the pollution goal of cutting landing and takeoff emissions of nitrogen oxides by 50 percent. All still have a little way to go to meet the other two challenges: the designs were very close to a 50 percent fuel burn reduction, but noise reduction capabilities varied.

"All of the teams have done really great work during this conceptual design study," say Mark Mangelsdorf, ERA Project chief engineer. "Their results make me excited about how interesting and different the airplanes on the airport ramp could look in 20 years. Another great result of the study is that they have really helped us focus where to invest our research dollars over the next few years," he said.

NASA's ERA project officials say they believe all the goals can be met if small gains in noise and fuel consumption reduction can be achieved in addition to those projected in the industry studies. The results shed light on the technology and design hurdles airline manufacturers face in trying to design lean, green flying machines and will help guide NASA's environmentally responsible aviation investment strategy for the second half of its six-year project.

Sunday, 19 February 2012

Reversing the problem clarifies molecular structure

Optical techniques enable us to examine single molecules, but do we really understand what we are seeing? After all, the fuzziness caused by effects such as light interference makes these images very difficult to interpret. Researchers at the University of Twente's MESA+ Institute for Nanotechnology have adopted a "reverse" approach to spectroscopy that cleans up the images by eliminating background noise.

The researchers presented their findings in Physical Review Letters.

Rather than starting with the laser beam, the trick is to take the molecule you are studying as the starting point. This radical "reversal" led to a relatively simple modification of conventional CARS spectroscopy that delivers better images. CARS is already a powerful technique that uses lasers to visualize molecules for purposes such as food testing and medical imaging. One advantage is that no fluorescent labels are needed to make the molecules visible. However, background noise complicates the task of interpreting the resulting images. The new approach eliminates such noise completely, leaving only the "real" image. More information than ever before, such as accurate details of a substance's concentration, can be obtained with the technique, and the signature of the molecule in question is easier to detect.

Energy

The key to side-stepping the overwhelming complexity involved lay in Prof. Shaul Mukamel's exhortation to just "Look at the molecule!" (the professor, who holds a post at the University of California, collaborated on the present publication). So don't focus on the way that light interacts with the molecule, as this makes it very difficult -- even impossible -- to "separate the wheat from the chaff" and reveal the real image. Instead, start by examining the energy levels inside the molecule. Previous work, based on Prof. Mukamel's exhortation, has mainly led to the development of new theories. The University of Twente researchers have now translated this theory into the new technique of Vibrational Molecular Interferometry, which will vastly expand the uses of CARS and other techniques.

This study was conducted in Prof. Jennifer Herek's Optical Sciences group, which is part of the MESA+ Institute for Nanotechnology at the University of Twente. The study was funded in part by the Foundation for Fundamental Research on Matter (FOM) and in part by the VICI grant previously awarded to Jennifer Herek by the Netherlands Organisation for Scientific Research (NWO).

The publication, entitled "Background-free nonlinear microspectroscopy with vibrational molecular interferometry," by Erik Garbacik, Jeroen Korterik, Cees Otto, Shaul Mukamel, Jennifer Herek and Herman Offerhaus, was published on 16 December in the online edition of Physical Review Letters.

Friday, 17 February 2012

Kepler team finds 11 new solar systems

NASA's planet-hunting Kepler space telescope has found 11 new planetary systems, including one with five planets all orbiting closer to their parent star than Mercury is to the Sun.

The discoveries boost the list of confirmed extra-solar planets to 729, including 60 credited to the Kepler team.

The telescope, launched in March 2009, detects slight but regular dips in the amount of light coming from stars. Scientists can then determine whether the changes are caused by orbiting planets passing in front of their stars, as seen from Kepler's vantage point.

Kepler scientists have another 2,300 candidate planets awaiting confirmation.

None of the newly discovered planetary systems are like our solar system, though Kepler-33, a star that is older and bigger than the Sun, comes close in terms of sheer numbers. It has five planets, compared to our solar system's eight, but the quintet all fly closer to their parent star than Mercury is to the Sun.

The planets range in size from about 1.5 times the diameter of Earth to five times Earth's diameter. Scientists have not yet determined if any are solid rocky bodies like Earth, Venus, Mars and Mercury or if they are filled with gas like Jupiter, Saturn, Uranus and Neptune.
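The brightness dip Kepler measures scales roughly with the square of the planet-to-star size ratio, which is how planet sizes are estimated from the light curves. A minimal sketch, assuming a Sun-sized host star for illustration (the actual host stars differ in size):

```python
# Transit-depth sketch: the fractional dip in starlight is roughly
# (planet radius / star radius) ** 2. The host star is assumed to be Sun-sized
# here for illustration; the newly reported stars vary in size.
EARTH_RADIUS_KM = 6371.0
SUN_RADIUS_KM = 696_000.0

def transit_depth(planet_size_in_earths: float, star_size_in_suns: float = 1.0) -> float:
    """Fractional drop in brightness when the planet crosses the star's disc."""
    ratio = (planet_size_in_earths * EARTH_RADIUS_KM) / (star_size_in_suns * SUN_RADIUS_KM)
    return ratio ** 2

for size in (1.5, 5.0):  # the size range quoted above (a ratio of diameters equals a ratio of radii)
    depth = transit_depth(size)
    print(f"{size} x Earth: dip of {depth * 100:.3f}% ({depth * 1e6:.0f} parts per million)")
```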

The Kepler team previously found one star with six confirmed planets and a second system with five planets, says planetary scientist Jack Lissauer, with NASA's Ames Research Center in California.

Nine of the new systems contain two planets and one has three, bringing the total number of newly discovered planets to 26. All are closer to their host stars than Venus is to the Sun.

"This has tripled the number of stars which we know have more than one transiting planet, so that's the big deal here," says Lissauer.

"We're starting to think in terms of planetary systems as opposed to just planets: Do they all tend to have similar sizes? What's the spacing? Is the solar system unusual in those regards?" he says.

Kepler is monitoring more than 150,000 stars in the constellations Cygnus and Lyra.

The research is published in four papers in the Astrophysical Journal and Monthly Notices of the Royal Astronomical Society.