More stories

  • Smartphones make consumers prefer unique, tailored products

    Personalized wine lists. Tailored clothing options. Unique experiences just for you.
    The world is awash in products and services that promise to provide custom experiences to every consumer. And it turns out our smartphones are pushing us to unconsciously prefer just these kinds of customized options.
    A new study from the University of Florida has discovered that consumers gravitate toward customized, rare or special products when they are engrossed in their phones. The highly private and personalized feelings we have toward our phones seem to encourage us to express our unique selves more than if we buy products on a larger computer — or borrow some stranger’s phone.
    The findings suggest that companies should — and indeed might already — change what they offer to consumers depending on what device they are using. The smartphone’s activation of a self-expression mindset also likely alters a range of behaviors, such as how people respond to political polls on mobile devices.
    “When you use your phone, your authentic self is being expressed to a greater extent. That affects the options you seek and the attitudes you express,” said Aner Sela, a professor in UF’s Warrington College of Business and one of the authors of the study.
    Sela and his former doctoral student Camilla Song, now an assistant professor at City University of Hong Kong, published their findings Aug. 3 in the Journal of Marketing Research.

  • Microscopic color converters move small laser-based devices closer to reality

    Lasers are everywhere. Devices that use them transmit information and enable the existence of long-distance communications and the internet; they aid doctors performing surgeries and engineers manufacturing advanced tools and technologies; and day-to-day, we encounter lasers as we scan our groceries and watch DVDs. “In the 60-some years since they were invented, lasers have absolutely transformed our lives,” said Giulio Cerullo, a nonlinear optics researcher at Politecnico di Milano in Italy.
    Today, with the help of new research from Cerullo and collaborators at Columbia University published in Nature Photonics, devices that use lasers are poised to become a whole lot smaller.
    Working in engineer James Schuck’s lab at Columbia, PhD student Xinyi Xu and postdoc Chiara Trovatello studied a 2D material called molybdenum disulfide (MoS2). They characterized how efficiently devices built from stacks of MoS2 less than one micron thick — that’s 100 times thinner than a human hair — convert light frequencies at telecom wavelengths to produce different colors.
    This new research is a first step toward replacing the standard materials used in today’s tunable lasers, which are measured in millimeters and centimeters, said Trovatello, who recently completed her PhD with Cerullo in Milan. “Nonlinear optics is currently a macroscopic world, but we want to make it microscopic,” she said.
    Lasers give off a special kind of coherent light, which means all the photons in the beam share the same frequency and thus the same color. Lasers operate only at specific frequencies, but devices often need to be able to deploy different colors of laser light. For example, the green beam of a laser pointer actually starts out as infrared laser light that a macroscopic material converts to a visible color. Researchers use nonlinear optical techniques to change the color of laser light, but conventionally used materials need to be relatively thick for color conversion to occur efficiently.
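    As an aside (an illustration added here, not a detail from the study): the standard mechanism behind a green laser pointer is second-harmonic generation, in which a nonlinear crystal doubles the frequency of the pump light and therefore halves its wavelength. A minimal sketch of the arithmetic, assuming a typical 1064 nm infrared pump:

        # Illustrative only; the 1064 nm pump wavelength is an assumed typical value.
        pump_nm = 1064.0         # near-infrared pump laser
        green_nm = pump_nm / 2   # frequency doubling halves the wavelength
        print(green_nm)          # 532.0 nm: visible green light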
    MoS2 is one of the most studied examples of an emerging class of materials called transition metal dichalcogenides, which can be peeled into atomically thin layers. Single layers of MoS2 can convert light frequencies efficiently, but are actually too thin to be used to build devices. Larger crystals of MoS2, meanwhile, tend to be more stable in a non-color converting form. To fabricate the necessary crystals, known as 3R-MoS2, the team worked with the commercial 2D-material supplier HQ Graphene.
    With 3R-MoS2 in hand, Xu began peeling off samples of varying thickness to test how efficiently they converted the frequency of light. Right away, the results were spectacular. “Rarely in science do you start on a project that ends up working better than you expect — usually it’s the opposite. This was a rare, magical case,” remarked Schuck. Usually, special sensors are needed to register the light produced by a sample, and it takes some time for them to do so, explained Xu. “With 3R-MoS2, we could see the extremely large enhancement almost immediately,” he said. Notably, the team recorded these conversions at telecom wavelengths, a key feature for potential optical communications applications, such as delivering internet and television services.
    In a fortunate accident during one scan, Xu focused on a random edge of a crystal and saw fringes that suggested waveguide modes were present inside the material. Waveguide modes keep different color photons, which otherwise move at different speeds across the crystal, in sync, and can possibly be used to generate so-called entangled photons, a key component of quantum optics applications. The team handed their devices off to the lab of physicist Dmitri Basov, where his postdoc Fabian Mooshammer confirmed their hunch.
    Currently, the most popular crystal for waveguided conversion and generating entangled photons is lithium niobate, a hard and stiff material that needs to be fairly thick to achieve useful conversion efficiencies. 3R-MoS2 is equally efficient but 100 times smaller, and flexible enough that it can be combined with silicon photonic platforms to create optical circuits on chips, following the trajectory of ever-smaller electronics.
    With this proof-of-concept result, the bottleneck toward real-life applications is large-scale production of 3R-MoS2 and high-throughput structuring of devices. There, the team says, industry will need to take over. With this work, they hope they’ve demonstrated the promise of 2D materials.
    “I’ve been working on nonlinear optics for more than thirty years now. Research is most often incremental, slowly building on what came before. It’s rare that you do something completely new with big potential,” said Cerullo. “I have a feeling that this new material could change the game.”

  • Robotic kidney cancer surgery shows desirable outcomes in study

    Kidney cancer is not always confined to the kidney. In advanced cases, this cancer invades the body’s biggest vein, the inferior vena cava (IVC), which carries blood out of the kidneys back to the heart. Via the IVC, cancer may infiltrate the liver and heart. The Mays Cancer Center at The University of Texas Health Science Center at San Antonio (UT Health San Antonio) is one of the high-volume centers in the U.S. with surgical expertise in treating this serious problem. The Mays Cancer Center is San Antonio’s National Cancer Institute-designated Cancer Center.
    In a study featured on the cover of the Journal of Urology (Official Journal of the American Urological Association), researchers from the Mays Cancer Center and Department of Urology at UT Health San Antonio show that robotic IVC thrombectomy (removal of cancer from the inferior vena cava) is not inferior to standard open IVC thrombectomy and is a safe, effective alternative approach. The affected kidney is removed along with the tumor during surgery, which is performed at UT Health San Antonio’s clinical partner, University Hospital.
    Harshit Garg, MD, urologic oncology fellow in the Department of Urology, is first author of the study, and Dharam Kaushik, MD, urologic oncology fellowship program director, is the senior author. Kaushik is an associate professor and the Stanley and Sandra Rosenberg Endowed Chair in Urologic Research at UT Health San Antonio.
    The open surgery requires an incision that begins 2 inches below the ribcage and extends downward on both sides of the ribcage. “It looks like an inverted V,” Kaushik said. Next, organs that surround the IVC, such as the liver, are mobilized, and the IVC is clamped above and below the cancer. In this way, surgeons gain control of the inferior vena cava for cancer resection.
    “Open surgery has an excellent success rate, and most cases are performed in this manner,” Kaushik said. “But now, with the robotic approach, we can achieve similar results with smaller incisions. Therefore, we need to study the implications of utilizing this newer approach.”
    The study is a systematic review and meta-analysis of data from 28 studies that enrolled 1,375 patients at different medical centers. Of these patients, 439 had robotic IVC thrombectomy and 936 had open surgery. Kaushik and his team collaborated with Memorial Sloan Kettering Cancer Center, New York; Cedars-Sinai Medical Center, Los Angeles; and the University of Washington, Seattle, to perform this study.
    “We pulled the data together to make conclusions because, before this, only small studies from single institutions had been conducted to compare the IVC thrombectomy approaches,” Kaushik said.
    Findings
    The results are encouraging and indicate that further study of robotic IVC thrombectomy is warranted. Compared with open surgery, the robotic approach was associated with:
    • Fewer blood transfusions: 18% of robotic patients required transfusions, compared with 64% of open-surgery patients.
    • Fewer complications: 5% of robotic patients experienced complications such as bleeding, compared with 36.7% of open thrombectomy patients.
    These large, technically challenging surgeries last eight to 10 hours and involve a multidisciplinary team of vascular surgeons, cardiac surgeons, transplant surgeons and urologic oncology surgeons, Kaushik said.
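    For a rough sense of scale (a back-of-the-envelope sketch added here, not figures reported by the authors), applying those pooled rates to the cohort sizes described above gives approximate patient counts:

        # Back-of-the-envelope only: assumes the pooled percentages apply uniformly
        # to the 439 robotic and 936 open-surgery patients described above.
        robotic_n, open_n = 439, 936
        print(round(0.18 * robotic_n), round(0.64 * open_n))    # transfusions: roughly 79 vs 599 patients
        print(round(0.05 * robotic_n), round(0.367 * open_n))   # complications: roughly 22 vs 344 patients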
    “This study is the largest meta-analysis analyzing the outcomes of robotic versus open IVC thrombectomy,” Kaushik said. “In more than 1,300 patients, we found that overall complications were lower with the robotic approach and the blood transfusion rate was lower with this approach.
    “That tells us there is more room for us to grow and refine this robotic procedure and to offer it to patients who are optimal candidates for it,” Kaushik said. “Optimal candidacy for a robotic surgery should be based on a surgeon’s robotic expertise, the extent and burden of the tumor, and the patient’s comorbid conditions. The open surgical approach remains the gold standard for achieving excellent surgical control.”

  • Compost to computer: Bio-based materials used to salvage rare earth elements

    What do corncobs and tomato peels have to do with electronics? They both can be used to salvage valuable rare earth elements, like neodymium, from electronic waste. Penn State researchers used micro- and nanoparticles created from the organic materials to capture rare earth elements from aqueous solutions.
    Their findings, available online now, will also be published in the November issue of Chemical Engineering Journal.
    “Waste products like corncobs, wood pulp, cotton and tomato peels often end up in landfills or in compost,” said corresponding author Amir Sheikhi, assistant professor of chemical engineering. “We wanted to transform these waste products into micro- or nanoscale particles capable of extracting rare earth elements from electronic waste.”
    Rare earth metals are used to manufacture the strong magnets found in motors for electric and hybrid cars, loudspeakers, headphones, computers, wind turbines, TV screens and more. However, mining these metals is challenging and environmentally costly, according to Sheikhi, as large land areas are required to mine even small amounts of the metals. Instead, efforts have turned to recycling the metals from electronic waste items like old computers or circuit boards.
    The challenge lies in efficiently separating the metals from refuse, Sheikhi said.
    “Using the organic materials as a platform, we created highly functional micro- and nanoparticles that can attach to metals like neodymium and separate them from the fluid that surrounds them,” Sheikhi said. “Via electrostatic interactions, the negatively-charged micro- and nano-scale materials bind to positively-charged neodymium ions, separating them.”
    To prepare the experiment, Sheikhi’s team ground up tomato peel and corncob and cut wood pulp and cotton paper into small, thin pieces and soaked them in water. Then, they chemically reacted these materials in a controlled fashion to disintegrate them into three distinct fractions of functional materials: microproducts, nanoparticles and solubilized biopolymers. Adding the microproducts or nanoparticles to neodymium solutions triggered the separation process, resulting in the capture of neodymium samples.

  • Building blocks of the future for photovoltaics

    An international research team led by the University of Göttingen has, for the first time, observed the build-up of a physical phenomenon that plays a role in the conversion of sunlight into electrical energy in 2D materials. The scientists succeeded in making quasiparticles — known as dark Moiré interlayer excitons — visible and explaining their formation using quantum mechanics. The researchers show how an experimental technique newly developed in Göttingen, femtosecond photoemission momentum microscopy, provides profound insights at a microscopic level, which will be relevant to the development of future technology. The results were published in Nature.
    Atomically thin structures made of two-dimensional semiconductor materials are promising candidates for future components in electronics, optoelectronics and photovoltaics. Interestingly, the properties of these semiconductors can be controlled in an unusual way: like Lego bricks, the atomically thin layers can be stacked on top of each other. There is, however, an important difference: while Lego bricks can only be stacked directly on top of one another or rotated by 90 degrees, the semiconductor layers can be stacked at any angle of rotation. It is precisely this angle of rotation that is interesting for the production of new types of solar cells.
    However, although changing this angle can reveal breakthroughs for new technologies, it also leads to experimental challenges. In fact, typical experimental approaches have only indirect access to the moiré interlayer excitons, therefore, these excitons are commonly termed “dark” excitons. “With the help of femtosecond photoemission momentum microscopy, we actually managed to make these dark excitons visible,” explains Dr. Marcel Reutzel, junior research group leader at the Faculty of Physics at Göttingen University. “This allows us to measure how the excitons are formed at a time scale of a millionth of a millionth of a millisecond. We can describe the dynamics of the formation of these excitons using quantum mechanical theory developed by Professor Ermin Malic’s research group at Marburg.”
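    As a quick unit check (added here for clarity, not part of the original quote), “a millionth of a millionth of a millisecond” works out to one femtosecond, which is where the technique gets its name:

        # 1 millisecond x one-millionth x one-millionth
        timescale_s = 1e-3 * 1e-6 * 1e-6   # = 1e-15 seconds, i.e. one femtosecond
        print(f"{timescale_s:.0e} s")      # prints "1e-15 s"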
    “These results not only give us a fundamental insight into the formation of dark Moiré interlayer excitons, but also open up a completely new perspective to enable scientists to study the optoelectronic properties of new and fascinating materials,” says Professor Stefan Mathias, head of the study at Göttingen University’s Faculty of Physics. “This experiment is ground-breaking because, for the first time, we have detected the signature of the Moiré potential imprinted on the exciton, that is, the impact of the combined properties of the two twisted semiconductor layers. In the future, we will study this specific effect further to learn more about the properties of the resulting materials.”
    This research was made possible thanks to the German Research Foundation (DFG), which provided Collaborative Research Centre funding for the CRCs “Control of Energy Conversion on Atomic Scales” and “Mathematics of Experiment” in Göttingen, and the CRC “Structure and Dynamics of Internal Interfaces” in Marburg.
    Story Source:
    Materials provided by University of Göttingen.

  • No one-size-fits-all artificial intelligence approach works for prevention, diagnosis or treatment using precision medicine

    A Rutgers analysis of dozens of artificial intelligence (AI) software programs used in precision, or personalized, medicine to prevent, diagnose and treat disease found that no single program can be used for all treatments.
    “Precision medicine is one of the most trending subjects in basic and medical science today,” said Zeeshan Ahmed, an assistant professor of medicine at Rutgers Robert Wood Johnson Medical School who led the study, published in Briefings in Bioinformatics. “Major reasons include its potential to provide predictive diagnostics and personalized treatment to variable known and rare disorders. However, until now, there has been very little effort exerted in organizing and understanding the many computing approaches to this field. We want to pave the way for a new data-centric era of discovery in health care.”
    Precision medicine, a technology still in its infancy, is an approach to treatment that uses information about an individual’s medical history and genetic profile and relates it to the information of many others to find patterns that can help prevent, diagnose or treat a disease. The AI-based approach rests on a high level of both computing power and machine-learning intelligence because of the enormous scope of medical and genetic information scoured and analyzed for patterns.
    The comparative and systematic review, believed by the authors to be one of the first of its kind, identified 32 of the most prevalent precision medicine AI approaches used to study preventive treatments for a range of diseases, including obesity, Alzheimer’s, inflammatory bowel disease, breast cancer and major depressive disorder. The bevy of AI approaches analyzed in the study — the researchers combed through five years of high-quality medical literature — suggests the field is advancing rapidly but is suffering from disorganization, Ahmed said.
    In AI, software programs simulate human intelligence processes. In machine learning, a subcategory of AI, programs are designed to “learn” as they process more and more data, becoming ever more accurate at predicting outcomes. The effort rests on algorithms, step-by-step procedures for solving a problem or performing a computation.
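    To make that idea concrete, here is a generic, self-contained sketch of the learn-from-data loop on synthetic data (illustrative only; it is not one of the 32 precision-medicine tools reviewed in the study):

        # Toy supervised-learning example (illustrative only; synthetic data).
        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(0)
        X = rng.normal(size=(200, 5))                   # 200 synthetic "patients", 5 features each
        y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)   # synthetic outcome label

        model = LogisticRegression().fit(X, y)          # the algorithm "learns" from the data
        print(model.score(X, y))                        # accuracy on the training data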
    Researchers such as Ahmed, who conducts studies on cardiovascular genomics at the Rutgers Institute for Health, Health Care Policy and Aging Research (IFH), are racing to collect and analyze complex biological data while also developing the computational systems that undergird the endeavor.
    Because the use of genetics is “arguably the most data-rich and complex component of precision medicine,” Ahmed said, the team focused especially on reviewing and comparing scientific objectives, methodologies, data sources, ethics and gaps in approaches used.
    Those interested in precision medicine, he said, can look to the paper for guidance as to which AI programs may be best suited for their research.
    To aid the advent of precision medicine, the study concluded that the scientific community needs to embrace several “grand challenges,” from addressing general issues such as improved data standardization and enhanced protection of personal identifying information to more technical issues such as correcting for errors in genomic and clinical data.
    “AI has the potential to play a vital role to achieve significant improvements in providing better individualized and population healthcare at lower costs,” Ahmed said. “We need to strive to address possible challenges that continue to slow the advancements of this breakthrough treatment approach.”
    Other Rutgers researchers involved in the study included Sreya Vadapalli and Habiba Abdelhalim, research assistants at the IFH, and Saman Zeeshan, a bioinformatics research scientist and former postdoctoral research associate at the Rutgers Cancer Institute of New Jersey.
    Story Source:
    Materials provided by Rutgers University. Original written by Kitta MacPherson.

  • Physics of high-temperature superconductors untangled

    When some materials are cooled to a certain temperature, they lose electric resistance, becoming superconductors.
    In this state, an electric charge can course through the material indefinitely, making superconductors a valuable resource for transmitting high volumes of electricity and other applications. Superconductors ferry electricity between Long Island and Manhattan. They’re used in medical imaging devices such as MRI machines, in particle accelerators and in magnets such as those used in maglev trains. Even unexpected materials, such as certain ceramic materials, can become superconductors when cooled sufficiently.
    But scientists have not previously understood what occurs in a material to make it a superconductor. In particular, how high-temperature superconductivity, which occurs in some copper-oxide materials, works has remained unclear. A 1966 theory examining a different type of superconductor posited that electrons whose spins point in opposite directions bind together to form what’s called a Cooper pair and allow electric current to pass through the material freely.
    A pair of University of Michigan-led studies examined how superconductivity works, and found, in the first paper, that about 50% of superconductivity can be attributed to the 1966 theory — but the reality, examined in the second paper, is a bit more complicated. The studies, led by recent U-M doctoral graduate Xinyang Dong and U-M physicist Emanuel Gull, are published in Nature Physics and the Proceedings of the National Academy of Sciences.
    Electrons floating in a crystal need something to bind them together, Gull said. Once you have two electrons bound together, they build a superconducting state. But what ties these electrons together? Electrons typically repel each other, but the 1966 theory suggested that in a crystal with strong quantum effects, the electron-electron repulsion is being screened, or absorbed, by the crystals.
    While the electron repulsion is absorbed by the crystal, an attraction emerges from the spin properties of the electrons and causes the electrons to bind into Cooper pairs. This is what underlies the absence of electrical resistance. However, the theory doesn’t account for complex quantum effects in these crystals.

  • Scientists unravel 'Hall effect' mystery in search for next generation memory storage devices

    An advance in the use of antiferromagnetic materials in memory storage devices has been made by an international team of physicists.
    Antiferromagnets are materials that have an internal magnetism caused by the spin of electrons, but almost no external magnetic field. They are of interest because of their potential for data storage, since the absence of this external (or ‘long-range’) magnetic field means the data units — bits — can be packed more densely within the material.
    This is in contrast to ferromagnets, used in standard magnetic memory devices. The bits in these devices do generate long-range magnetic fields, which prevent them being packed too closely, because otherwise they would interact.
    The property that is measured to read out an antiferromagnetic bit is called the Hall effect, which is a voltage that appears perpendicular to the applied current direction. If the spins in the antiferromagnet are all flipped, the Hall voltage changes sign. So one sign of the Hall voltage corresponds to a ‘1’, and the other sign to a ‘0’ — the basis of binary code used in all computing systems.
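    A schematic sketch of that read-out logic (added here for illustration; it is not the authors’ measurement code):

        # The sign of the Hall voltage encodes the bit: one sign is read as '1', the other as '0'.
        def read_bit(hall_voltage: float) -> int:
            return 1 if hall_voltage > 0 else 0

        print(read_bit(3.2e-6), read_bit(-3.2e-6))   # 1 0 (example voltages are arbitrary)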
    Although scientists have known about the Hall effect in ferromagnetic materials for a long time, the effect in antiferromagnets has only been recognised in the past decade or so and is still poorly understood.
    A team of researchers at the University of Tokyo in Japan, Cornell and Johns Hopkins Universities in the USA, and the University of Birmingham in the UK have suggested an explanation for the ‘Hall effect’ in a Weyl antiferromagnet (Mn3Sn), a material which has a particularly strong spontaneous Hall effect.
    Their results, published in Nature Physics, have implications for both ferromagnets and antiferromagnets — and therefore for next generation memory storage devices overall.
    The researchers were interested in Mn3Sn because it is not a perfect antiferromagnet, but does have a weak external magnetic field. The team wanted to find out if this weak magnetic field was responsible for the Hall effect.
    In their experiment, the team used a device invented by Doctor Clifford Hicks, at the University of Birmingham, who is also a co-author on the paper. The device can be used to apply a tunable stress to the material being tested. By applying this stress to this Weyl antiferromagnet, the researchers observed that the residual external magnetic field increased.
    If the magnetic field were driving the Hall effect, there would be a corresponding effect on the voltage across the material. The researchers showed that, in fact, the voltage does not change substantially, proving that the magnetic field is not important. Instead, they concluded, the arrangement of spinning electrons within the material is responsible for the Hall effect.
    Clifford Hicks, co-author on the paper at the University of Birmingham, said: “These experiments prove that the Hall effect is caused by the quantum interactions between conduction electrons and their spins. The findings are important for understanding — and improving — magnetic memory technology.”
    Story Source:
    Materials provided by University of Birmingham.