Almost-universal SERS sensor could change how we sniff out small things

A new almost-universal SERS substrate could be the key to cheaper and easier sensors for drugs, explosives, or anything else (Credit: University at Buffalo)

Identifying fraudulent paintings from spectral data, highlighting cancerous cells in a sea of healthy ones, and distinguishing strains of bacteria in food samples are all applications of surface-enhanced Raman spectroscopy (SERS), a sensing technique that has only grown more in demand as our appetite for precise, instantaneous information has increased. However, the technology has largely failed to reach commercialization because the chips it relies on are difficult and expensive to make, work only for a particular known substance, and are consumed in a single use. Researchers led by a team from the University at Buffalo (UB) aim to change that with an almost-universal substrate that is also low-cost, opening up more opportunities for powerful analysis of our environment.

Though complicated on the surface, SERS forms a critical part of testing for explosives, identifying toxins in food, and other applications in public health and safety, medicine, and research.

The technique relies on the unique electromagnetic response of chemical compounds when they are stimulated with laser light of varying wavelengths while in contact with a surface designed to enhance the signal (the “surface” in SERS). Each compound has a distinct spectral fingerprint, so a researcher can distinguish compounds that look identical to the human eye without doping the sample with labeling chemicals or needing a large sample.
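
To make the identification step concrete, here is a minimal sketch of matching a measured spectrum against a library of known fingerprints. The compounds, peak positions, and cosine-similarity scoring are invented for illustration; real SERS pipelines use curated spectral databases and more robust matching:

```python
import numpy as np

# Common Raman-shift grid for all spectra (values chosen for illustration).
wavenumbers = np.linspace(400, 1800, 500)     # Raman shift, cm^-1

def synthetic_spectrum(peaks):
    """Build a toy spectrum as a sum of Gaussian peaks (center, height)."""
    s = np.zeros_like(wavenumbers)
    for center, height in peaks:
        s += height * np.exp(-((wavenumbers - center) / 15.0) ** 2)
    return s

# Hypothetical fingerprint library: compound name -> reference spectrum.
library = {
    "compound_A": synthetic_spectrum([(520, 1.0), (950, 0.6)]),
    "compound_B": synthetic_spectrum([(700, 0.8), (1350, 1.0)]),
}

def identify(measured):
    """Return the library compound whose fingerprint best matches
    `measured`, scored by cosine similarity."""
    def cosine(a, b):
        return np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return max(library, key=lambda name: cosine(measured, library[name]))

# Simulate a noisy measurement of compound B and identify it.
rng = np.random.default_rng(0)
noisy = library["compound_B"] + 0.05 * rng.normal(size=wavenumbers.size)
print(identify(noisy))                        # -> compound_B
```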

Yet most substrates available on commercial chips today are optimized for a single wavelength of light, meaning scientists working with multiple compounds may need several chips to identify all their samples. It also rules out identifying unknown samples, which by their nature would require testing on multiple substrates.

The research, from the team at UB and Fudan University in China, introduces a substrate with a broadband nanostructure that “traps” the wide range of light most often used in SERS analysis, between 450 and 1100 nm.

The surface consists of a thin film of silver or aluminum acting as a mirror, topped by a dielectric layer of silica or alumina that separates the “mirror” from a layer of randomly deposited silver nanoparticles. This construction avoids expensive lithographic fabrication techniques.

Researcher Nan Zhang summed up the importance of the design by comparing it to a skeleton key.

“Instead of needing all these different substrates to measure Raman signals excited by different wavelengths, you’ll eventually need just one,” says Zhang. “Just like a skeleton key that opens many doors.”

Perhaps soon these “keys” will be available for airport screening, counterfeit protection, chemical weapon detection and a host of other purposes requiring flexible, cheap sensors.

“The applications of such a device are far-reaching,” said Kai Liu, a PhD candidate in electrical engineering at UB. “The ability to detect even smaller amounts of chemical and biological molecules could be helpful with biosensors that are used to detect cancer, malaria, HIV and other illnesses.”

References: http://www.gizmag.com/

A new tool measures the distance between phonon collisions

Today’s computer chips pack billions of tiny transistors onto a plate of silicon the width of a fingernail. Each transistor, just tens of nanometers wide, acts as a switch that, in concert with others, carries out a computer’s computations. As dense forests of transistors signal back and forth, they give off heat, which can fry the electronics if a chip gets too hot.

Manufacturers commonly apply a classical diffusion theory to gauge a transistor’s temperature rise in a computer chip. But now an experiment by MIT engineers suggests that this common theory doesn’t hold up at extremely small length scales. The group’s results indicate that the diffusion theory underestimates the temperature rise of nanoscale heat sources, such as a computer chip’s transistors. Such a miscalculation could affect the reliability and performance of chips and other microelectronic devices.

“We verified that when the heat source is very small, you cannot use the diffusion theory to calculate temperature rise of a device. Temperature rise is higher than diffusion prediction, and in microelectronics, you don’t want that to happen,” says Professor Gang Chen, head of the Department of Mechanical Engineering at MIT. “So this might change the way people think about how to model thermal problems in microelectronics.”

The group, including graduate student Lingping Zeng and Institute Professor Mildred Dresselhaus of MIT, Yongjie Hu of the University of California at Los Angeles, and Austin Minnich of Caltech, has published its results this week in the journal Nature Nanotechnology.
Phonon mean free path distribution

Chen and his colleagues came to their conclusion after devising an experiment to measure heat carriers’ “mean free path” distribution in a material. In semiconductors and dielectrics, heat typically flows in the form of phonons: wavelike particles that carry heat through a material and undergo various scattering events as they propagate. A phonon’s mean free path is the distance it can carry heat before colliding with another particle; the longer a phonon’s mean free path, the better it is able to carry, or conduct, heat.

As the mean free path can vary from phonon to phonon in a given material, from several nanometers to microns, the material exhibits a mean free path distribution, or range. Chen, the Carl Richard Soderberg Professor in Power Engineering at MIT, reasoned that measuring this distribution would provide a more detailed picture of a material’s heat-carrying capability, enabling researchers to engineer materials, for example by using nanostructures to limit the distance that phonons travel.

The group sought to establish a framework and a tool for measuring the mean free path distribution in a number of technologically interesting materials. There are two thermal transport regimes: the diffusive regime and the quasiballistic regime. The former yields only the bulk thermal conductivity, which masks the important mean free path distribution. To study phonons’ mean free paths, the researchers realized they would need a heat source small compared with the phonon mean free path in order to access the quasiballistic regime, as larger heat sources would essentially mask individual phonons’ effects.

Creating nanoscale heat sources was a significant challenge: lasers can only be focused to a spot about the size of the light’s wavelength, roughly one micron, more than 10 times the mean free path of some phonons. To concentrate the energy of laser light onto an even finer area, the team patterned aluminum dots of various sizes, from tens of micrometers down to 30 nanometers, across the surfaces of silicon, a silicon germanium alloy, gallium arsenide, gallium nitride, and sapphire. Each dot absorbs and concentrates a laser’s heat, which then flows through the underlying material as phonons.

In their experiments, Chen and his colleagues used microfabrication to vary the size of the aluminum dots and measured the decay of a pulsed laser reflected from the material, an indirect measure of heat propagation in the material. They found that as the size of the heat source becomes smaller, the temperature rise deviates from the diffusion theory.

Their interpretation: as the metal dots, which act as heat sources, become smaller, phonons leaving the dots tend to become “ballistic,” shooting across the underlying material without scattering. Such phonons contribute little to the material’s thermal conductivity. For much larger heat sources acting on the same material, phonons tend to collide with other phonons and scatter more often, and in these cases the diffusion theory currently in use remains valid.
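
A rough feel for why a shrinking heat source lowers the apparent conductivity comes from a toy forward model. The mean free path distribution and the 1/(1 + Knudsen number) suppression function below are illustrative assumptions, not the reconstruction used in the Nature Nanotechnology paper:

```python
import numpy as np

# Toy model: spread a material's bulk conductivity over a distribution of
# phonon mean free paths (all values invented for illustration).
mfps = np.logspace(-8, -5, 200)              # mean free paths: 10 nm to 10 um
weights = mfps / mfps.sum()                  # toy distribution, sums to 1
k_bulk = 150.0                               # bulk conductivity, W/(m*K)

def effective_conductivity(heater_size_m):
    """Apparent conductivity seen by a heater of the given size.

    Phonons whose mean free path exceeds the heater size travel
    ballistically and contribute little; a simple 1/(1 + Knudsen number)
    suppression models this (an assumed gray model, not the paper's)."""
    knudsen = mfps / heater_size_m
    return k_bulk * np.sum(weights / (1.0 + knudsen))

for size in [10e-6, 1e-6, 100e-9, 30e-9]:    # heater sizes in meters
    print(f"{size * 1e9:7.0f} nm heater -> "
          f"k_eff = {effective_conductivity(size):6.1f} W/(m*K)")
```

Running the sweep shows the apparent conductivity collapsing as the heater shrinks below the longer mean free paths, which is exactly the regime where diffusion theory underpredicts the temperature rise.
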
A detailed transport picture

For each material, the researchers plotted a distribution of mean free paths, reconstructed from the material’s heater-size-dependent thermal conductivity. Overall, they observed the anticipated new picture of heat conduction: while the common, classical diffusion theory is applicable to large heat sources, it fails for small heat sources. By varying the size of the heat sources, Chen and his colleagues can map out how far phonons travel between collisions, and how much they contribute to heat conduction.

Zeng says that the group’s experimental setup can be used to better understand, and potentially tune, a material’s thermal conductivity. For example, if an engineer desires a material with certain thermal properties, the mean free path distribution could serve as a blueprint for designing specific “scattering centers” within the material: locations that prompt phonon collisions, impeding heat propagation and reducing the material’s ability to carry heat. Although such effects are undesirable for keeping a computer chip cool, they are useful in thermoelectric devices, which convert heat to electricity. Such applications call for materials that conduct electricity but insulate against heat.

“The important thing is, we have a spectroscopy tool to measure the mean free path distribution, and that distribution is important for many technological applications,” Zeng says.

References: http://phys.org/

Entangled photons unlock new super-sensitive characterisation of quantum technology

A new protocol for estimating unknown optical processes, called unitary operations, with precision enhanced by the unique properties of quantum mechanics has been demonstrated by scientists and engineers from the University of Bristol, UK, and the Centre for Quantum Technologies in Singapore.

The work, published in the June issue of Optica, could lead to both dramatically better sensors for medical research and new approaches to benchmarking the performance of ultra-powerful quantum computers.

History tells us that the ability to measure parameters and sense phenomena with increasing precision leads to dramatic advances in identifying new phenomena in science and in improving the performance of technology: famous examples include X-ray imaging, magnetic resonance imaging (MRI), interferometry and the scanning-tunnelling microscope.

Scientists’ understanding of how to engineer and control quantum systems to vastly expand the limits of measurement and sensing is growing rapidly. This area, known as quantum metrology, promises to open up radically alternative methods to the current state of the art in sensing.

In this new study, the researchers redirected the sensing power of quantum mechanics back on itself to characterise, with increased precision, unknown quantum processes, which can include individual components used to build quantum computers. This ability is becoming more and more important as quantum technologies move closer to real applications.

Dr Xiao-Qi Zhou of Bristol’s School of Physics said: “A really exciting problem is characterising unknown quantum processes using a technique called quantum process tomography. You can think of this as a problem where a quantum object, maybe a photonic circuit or an atomic system, is locked in a box. We can send quantum states in and we can measure the quantum states that come out. Our challenge is to correctly identify what is in the box. This is a difficult problem in quantum mechanics and it is a highly active area of research, because its solution is needed to enable us to test quantum computers as they grow in size and complexity.”

One major shortcoming of quantum process tomography is that its precision using standard techniques is limited by a type of noise known as ‘shot noise’. By borrowing techniques from quantum metrology, the researchers were able to demonstrate precision beyond the shot-noise limit. They expect their protocol can also be applied to build more sophisticated sensors that identify molecules and chemicals more precisely by observing how they interact with quantum states of light.
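
To see what the shot-noise limit means in numbers, here is a minimal sketch of single-parameter phase estimation. It simulates only the classical strategy of probing with N independent photons, then compares the spread of the estimates against the standard quantum limit (1/sqrt(N)) and the Heisenberg limit (1/N) that entangled probes can approach; it does not reproduce the paper’s process-tomography protocol:

```python
import numpy as np

rng = np.random.default_rng(1)
phi = 0.3        # the unknown phase to be estimated, in radians
N = 1000         # photons used per trial
trials = 2000    # repeated experiments, to measure the estimator's spread

# Classical strategy: each photon independently exits one of two ports
# with probability (1 + cos(phi)) / 2; we invert the counting statistics.
p = (1 + np.cos(phi)) / 2
counts = rng.binomial(N, p, size=trials)
phi_hat = np.arccos(2 * counts / N - 1)

print(f"independent photons, spread:  {phi_hat.std():.4f}")
print(f"shot-noise limit (1/sqrt(N)): {1 / np.sqrt(N):.4f}")
print(f"Heisenberg limit (1/N):       {1 / N:.4f}")
```

The simulated spread lands on the 1/sqrt(N) line; entanglement-enhanced protocols like the one demonstrated here aim to push the uncertainty below it, toward the 1/N scaling.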

Co-author Rebecca Whittaker, a PhD student in Bristol’s Centre for Quantum Photonics, said: “The optical processes we measured here can be used to manipulate quantum bits of information in a quantum computer, but they can also occur in nature. For example, our setup could be used to measure how the polarisation of light is rotated by a sample. We could then infer properties of that sample with better precision.
“Increasing measurement precision is particularly important for probing light-sensitive samples where we want to get as much information as we can before our probe light damages or causes alterations to the sample. We feel this will have a big impact on the tools used in medical research.”
The researchers’ protocol relies on generating multiple photons in an entangled state, and this study demonstrates that it can reconstruct rotations acting on the polarisation of light.

References: http://phys.org/

Gadgets powered wirelessly at home with a simple Wi-Fi router

Our homes are a tangled mess of wires and chargers. But that might be about to change. Work is under way to use the Wi-Fi signals that surround us to power our gadgets.

In Seattle, six households have taken part in an experiment in which modified electrical devices were put in their homes along with a Wi-Fi router. Over 24 hours, the devices were powered solely by the router’s signal, which also continued to provide wireless internet access to the home.

How was this possible? The energy of the radio waves the router sent out was converted into direct current voltage with a component called a rectifier, much as solar panels convert light energy into electrical energy. That voltage was then boosted to a useful level by a DC-DC converter (arxiv.org/abs/1505.06815).
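
For a sense of scale, here is a back-of-the-envelope harvesting budget built from the free-space Friis equation. Every number below (transmit power, antenna gains, distance, rectifier and converter efficiencies) is an illustrative assumption, not a figure from the Seattle experiment:

```python
import math

def harvested_power_w(p_tx_w=1.0, gain_tx=1.6, gain_rx=1.6,
                      freq_hz=2.4e9, distance_m=5.0,
                      rectifier_eff=0.3, converter_eff=0.8):
    """Rough RF energy-harvesting budget: Friis free-space propagation,
    then rectifier (RF -> DC) and DC-DC converter losses. All defaults
    are assumed values, chosen only for illustration."""
    wavelength_m = 3e8 / freq_hz
    p_rx = (p_tx_w * gain_tx * gain_rx *
            (wavelength_m / (4 * math.pi * distance_m)) ** 2)
    return p_rx * rectifier_eff * converter_eff

print(f"{harvested_power_w() * 1e6:.1f} microwatts")  # a few uW at 5 m
```

A few microwatts at living-room distances is plenty for a temperature sensor or a trickle-charged battery, and hopeless for directly running a phone, which is exactly the gap described below.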

The system powered temperature sensors and battery-less low-resolution cameras, and charged standard batteries.

The hard part is getting the router to constantly push out enough energy, says team member Vamsi Talla from the University of Washington in Seattle.

When someone is browsing the web, the Wi-Fi signal is active and can be used to power devices. However, when no one is browsing, the signal goes quiet.

“With Wi-Fi for communications, you only want to transmit when you have data to send,” Talla says. “But for power delivery, you want to transmit something all the time. There’s a clear mismatch.”

To get around this, the team designed software that broadcasts meaningless data across several Wi-Fi channels when no one is using the internet.
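
Conceptually, the workaround is simple scheduling. The toy sketch below transmits meaningless broadcast datagrams whenever the link is idle; the real system lives in the router’s firmware and spreads the filler across several Wi-Fi channels, which a user-space loop like this cannot do:

```python
import socket
import time

FILLER = bytes(1024)   # 1 KB of zeros: deliberately meaningless payload

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)

def radio_idle():
    """Placeholder: a real router would inspect its transmit queue here."""
    return True

# Keep the radio radiating: send filler whenever there is no real traffic.
while True:
    if radio_idle():
        sock.sendto(FILLER, ("255.255.255.255", 9))  # port 9 = "discard"
    time.sleep(0.01)
```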

Small devices could use this as part of an internet of things, says Ben Potter at the University of Reading, UK. “Where we’re heading is to have more sensors in everything around us,” he says. “Innovations with microchips mean they can run with less power. For that type of application, this is interesting technology.”

The problem is that Wi-Fi is never going to provide a very powerful signal. Wi-Fi is tightly regulated in many countries – the US Federal Communications Commission (FCC), for example, limits the power of a Wi-Fi broadcast to 1 watt. An iPhone charger delivers at least 5 watts – and has no other demands on its output.

One company with a solution is Ossia in Bellevue, Washington. Its system, called Cota, gets around the FCC regulations with a wireless hub that transmits waves at a Wi-Fi frequency but doesn’t send a communications signal.

The Cota set-up can produce up to 20 watts, but would only deliver 1 watt to a single phone. CEO Hatem Zeine says that’s enough to charge an iPhone 5 several times over in a single day if it has constant access to the signal.
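
Taking Zeine’s figure at face value, the arithmetic holds up. Assuming an iPhone 5 battery capacity of roughly 5.45 Wh (an assumed figure) and ignoring conversion losses:

```python
battery_wh = 5.45   # approximate iPhone 5 battery capacity (assumption)
power_w = 1.0       # Cota's claimed delivery to a single phone
hours_per_charge = battery_wh / power_w
print(f"{hours_per_charge:.1f} h per charge, "
      f"about {24 / hours_per_charge:.0f} charges per day")
```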

“Unlike Wi-Fi, our power signal is unmodulated,” says Zeine. “It’s a continuous wave, there’s no message in it.”

A receiver chip on the device being charged tells the hub which of Cota’s thousands of antennas it is receiving signals from. Those antennas alone are kept active and the system is able to ignore other objects in the room, such as a human body.
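
The antenna-selection step can be pictured as a simple filter over receiver feedback. The antenna IDs, signal readings, and threshold below are invented for illustration; Cota’s actual protocol is proprietary:

```python
def select_antennas(reports, threshold=0.5):
    """Keep only the hub antennas the receiver actually hears well.

    `reports` maps antenna ID -> signal strength reported back by the
    device's receiver chip (both hypothetical here)."""
    return {ant for ant, signal in reports.items() if signal > threshold}

reports = {"ant_001": 0.9, "ant_002": 0.1, "ant_003": 0.7}
print(select_antennas(reports))   # -> {'ant_001', 'ant_003'} (order varies)
```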

Eric Woods, an IT infrastructure researcher at consultancy firm Navigant in London, thinks there will be demand for this type of technology for the many sensors that will fill the smart homes and cities of the future.

Sensors powered by Wi-Fi could be used to monitor air quality or the status of systems across a city, says Woods. “Removing the need to think about batteries takes away one of the barriers to the exploitation of those technologies,” he says.

References: http://www.newscientist.com/