
New research suggests innovative method to analyse the densest star systems in the Universe


Artist’s illustration of a supernova remnant. Credit: Pixabay

In a recently published study, a team of researchers led by the ARC Centre of Excellence for Gravitational Wave Discovery (OzGrav) at Monash University suggests an innovative method for analysing gravitational waves from neutron star mergers, in which the two stars are distinguished by type, depending on how fast they’re spinning, rather than by mass.


Neutron stars are extremely dense stellar objects that form when giant stars explode and die. In the explosion, their cores collapse, and the protons and electrons are crushed together into neutrons, leaving behind a remnant neutron star.

In 2017, the LIGO and Virgo gravitational-wave detectors made the first ever observation of two neutron stars merging, an event called GW170817. This merger is well known because scientists were also able to see the light produced by it: high-energy gamma rays, visible light, and microwaves. Since then, an average of three scientific studies on GW170817 have been published every day.

In January this year, the LIGO and Virgo collaborations announced a second neutron star merger event, called GW190425. Although no light was detected, this event is particularly intriguing because the two merging neutron stars are significantly heavier than those of GW170817, and heavier than any previously known double neutron star system in the Milky Way.

Scientists use gravitational-wave signals—ripples in the fabric of space and time—to detect pairs of neutron stars and measure their masses. The heavier neutron star of the pair is called the ‘primary’; the lighter one is ‘secondary’.
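To make the mass-based labels concrete, and to contrast them with the spin-based ‘recycled’ and ‘slow’ labels discussed below, here is a minimal illustrative sketch in Python. It is not code from the study, and the star names, masses, and spin frequencies are hypothetical placeholders.

```python
# Illustrative sketch only: the masses and spin frequencies below are
# hypothetical placeholders, and the labelling functions are not taken
# from the study itself.

def label_by_mass(star_a, star_b):
    """Conventional labelling: the heavier star is the 'primary',
    the lighter star is the 'secondary'."""
    heavier, lighter = sorted((star_a, star_b),
                              key=lambda s: s["mass_msun"], reverse=True)
    return {"primary": heavier["name"], "secondary": lighter["name"]}

def label_by_spin(star_a, star_b):
    """Spin-based labelling: the faster-spinning star is treated as the
    'recycled' star, the slower one as the 'slow' star."""
    faster, slower = sorted((star_a, star_b),
                            key=lambda s: s["spin_hz"], reverse=True)
    return {"recycled": faster["name"], "slow": slower["name"]}

ns1 = {"name": "NS1", "mass_msun": 1.6, "spin_hz": 40.0}  # hypothetical values
ns2 = {"name": "NS2", "mass_msun": 1.4, "spin_hz": 0.5}   # hypothetical values

print(label_by_mass(ns1, ns2))  # {'primary': 'NS1', 'secondary': 'NS2'}
print(label_by_spin(ns1, ns2))  # {'recycled': 'NS1', 'slow': 'NS2'}
```

The point of the spin-based scheme is that the two sets of labels need not line up: in a system where the lighter star happens to spin faster, the ‘secondary’ would also be the ‘recycled’ star.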

The recycled-slow labelling scheme of a binary neutron star system

A binary neutron star system usually starts with two ordinary stars, each around ten to twenty times more massive than the Sun. When these massive stars age and run out of ‘fuel’, their lives end in supernova explosions that leave behind compact remnants, or neutron stars. Each remnant neutron star weighs around


After a Year on the Ice, the Biggest Arctic Research Mission Is Done



The Polarstern amidst Arctic sea ice.
Photo: NOAA, University of Colorado, Boulder, and MOSAiC

The largest Arctic research campaign in history just came to a close. For more than a year, a rotating group of roughly 500 scientists and staffers have been traveling the region on a research vessel called the Polarstern as part of the Multidisciplinary drifting Observatory for the Study of Arctic Climate expedition, or MOSAiC.

The expedition began last September, when a team packed the ship with 1 million pounds of equipment and set off from Norway toward the North Pole. They then attached the vessel to an ice floe north of Siberia and let it carry them westward for thousands of miles. This allowed the multidisciplinary group of researchers to closely observe the Arctic’s air, ice, and ecosystems to learn more about them and their bearing on our changing climate.

The team studied everything from zooplankton and polar bears to sea ice and wind patterns. Along the way, they encountered many difficulties. At several points, for instance, the ice broke up more than they expected it would and forced them to change their planned path. They also saw dangerous storms, which in more than one case damaged their equipment. At one point, an Arctic fox chewed through data cables—seriously. And of course, there was the covid-19 pandemic, which forced them to pause the expedition for three weeks after a crew member getting ready to deploy to the vessel tested positive, delaying some of their research.


Photo: Lianna Nixon, CIRES/University of Colorado, Boulder

The unique science and the circumstances of the pandemic weren’t the only reasons the expedition made news: its controversial dress code, which prohibited women from wearing tight clothing, also drew backlash for being sexist. Despite these challenges, the scientists arrived back on


Nearly Half of South America’s Mammals Came from North America, New Research May Explain Why


North and South America haven’t always been connected. South America functioned as a continent-sized island for millions of years following the extinction of the dinosaurs, incubating its own strange assemblage of animals such as giant ground sloths, massive armored mammals akin to armadillos and saber-toothed marsupial carnivores. Meanwhile, North America was exchanging animals with Asia, a process that populated the northern continent with the ancestors of modern horses, camels and cats, writes Asher Elbein for the New York Times.

Finally, when tectonic activity formed the Isthmus of Panama roughly ten million years ago, a massive biological exchange took place. The many species that had been evolving in isolation from one another on both continents began migrating across the narrow new land bridge. Llamas, raccoons, wolves and bears trekked south, while armadillos, possums and porcupines went north.

It would be reasonable to expect that this grand biological and geological event, known to paleontologists as the Great American Biotic Interchange, would have resulted in equal numbers of northern and southern species spreading across the two land masses, but that’s not what happened.

Instead, many more North American mammal species made homes down south than the other way around. Almost half of living South American mammals have North American evolutionary roots, whereas only around ten percent of North American mammals once hailed from South America. Now, researchers who reviewed some 20,000 fossils may have an answer, according to the Times.

According to the paper, published this week in the journal Proceedings of the National Academy of Sciences, the asymmetry of immigrant mammal diversity we see today was the result of droves of South American mammals going extinct, leaving gaping ecological holes waiting to be filled by northern species and shrinking the pool of potential immigrant species available to make the trek north, explains Christine Janis, an ecologist at


Programmable medicine is the goal for new bio-circuitry research


Researchers Holt and Kwong. Credit: Georgia Tech Institute for Electronics and Nanotechnology

In the world of synthetic biology, the development of foundational components like logic gates and genetic clocks has enabled the design of circuits with increasing complexity, including the ability to solve math problems, build autonomous robots, and play interactive games. A team of researchers at the Georgia Institute of Technology is now using what they’ve learned about bio-circuits to lay the groundwork for the future of programmable medicine.


To the eye, these programmable drugs would look like any other small vial of clear liquid, but they would communicate directly with our biological systems, dynamically responding to the information flowing through our bodies to automatically deliver proper doses where and when they are needed. These future medicines might even live inside us throughout our lives, fighting infection, detecting cancer and other diseases, essentially becoming a therapeutic biological extension of ourselves.

We are years away from that, but the insights gained from research in Gabe Kwong’s lab are moving us closer with the development of “enzyme computers”—engineered bio-circuits designed with biological components, with the capacity to expand and augment living functions.
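As a rough illustration of the logic-gate idea only, and not of the protease circuits engineered in this work, an AND gate can be modelled in software as an output that switches on only when both inputs exceed some activity threshold; the input values and threshold below are hypothetical.

```python
# Toy software model of a two-input AND gate, loosely analogous to
# threshold-style bio-circuit logic. The inputs and threshold are
# hypothetical and do not represent the paper's enzyme circuits.

def and_gate(input_a: float, input_b: float, threshold: float = 0.5) -> bool:
    """Output is 'on' only when both input activities exceed the threshold."""
    return input_a > threshold and input_b > threshold

# Truth-table-style check of the gate's behaviour.
for a, b in [(0.1, 0.1), (0.9, 0.1), (0.1, 0.9), (0.9, 0.9)]:
    print(f"inputs=({a}, {b}) -> output={'on' if and_gate(a, b) else 'off'}")
```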

“The long-term vision is this concept of programmable immunity,” said Kwong, associate professor in the Wallace H. Coulter Department of Biomedical Engineering at Georgia Tech and Emory University, who partnered with fellow researcher Brandon Holt on the paper, “Protease circuits for processing biological information,” published Oct. 6 in the journal Nature Communications. The research was sponsored by the National Institutes of Health.

Analog-to-digital converter. Credit: Georgia Tech Institute for Electronics and Nanotechnology

The story of this paper begins two years ago. “Our lab has a rich history of developing enzyme-based diagnostics,” Holt said. “Eventually we started thinking about these systems as computers, which led us to design simple logic gates, such as AND


University in Gujarat gets Supercomputer ‘Param’; To help in AI, big data education and research






© Provided by The Financial Express

By Dr. RB Jadeja

The Gujarat Council on Science and Technology (GUJCOST) of the Gujarat Government has granted Marwadi University a ‘Param’ supercomputer. With this, Marwadi University is among only a handful of universities in the state to have a supercomputing facility. The facility will be equipped with a Param Shavak system, developed at the Centre for Development of Advanced Computing (C-DAC) for high-performance computing and deep learning, featuring the latest x86-based Intel processors, 98 GB of RAM, 16 TB of storage, Nvidia-based co-processing accelerator technologies, and a software development environment.

Supercomputers support computation on large datasets, terabytes or even petabytes of data, carrying out billions of mathematical operations many times faster than a regular laptop or desktop. The distinguishing feature behind this magnified speed is the parallel processing of computing operations across many processors at once.
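As a generic illustration of that idea, and not software tied to the Param Shavak system, the sketch below runs the same CPU-bound workload first serially and then split across processor cores with Python’s multiprocessing module; on a multi-core machine the parallel run typically finishes several times faster.

```python
# Generic illustration of parallel processing: the same workload run
# serially and then split across CPU cores. Not specific to Param Shavak.
import time
from multiprocessing import Pool

def heavy_sum(n: int) -> float:
    """A deliberately CPU-bound task: sum of square roots up to n."""
    return sum(i ** 0.5 for i in range(n))

if __name__ == "__main__":
    tasks = [5_000_000] * 8  # eight independent chunks of work

    start = time.perf_counter()
    serial = [heavy_sum(n) for n in tasks]
    print(f"serial:   {time.perf_counter() - start:.2f} s")

    start = time.perf_counter()
    with Pool() as pool:           # one worker per available core by default
        parallel = pool.map(heavy_sum, tasks)
    print(f"parallel: {time.perf_counter() - start:.2f} s")

    assert serial == parallel      # same results, computed faster in parallel
```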

With such a facility, undergraduate and postgraduate (UG/PG) students will receive training and lectures on emerging topics in HPC, deep learning, and software development, including parallel programming, programming models, algorithms, hardware architecture and its impact on code design choices, high-quality software development in collaborative environments, visualization, and workflow. It will additionally create research opportunities for faculty members and researchers.

At present, Marwadi University admits 400 students every year to the Computer & IT Department and 60 students to Computer Applications. An estimated 20% of these students undertake projects on AI/ML/DL that may use the supercomputing facility, so approximately 90 students would be directly involved. In addition, 100 students every year in the AI & Big Data class have projects directly tied to such a facility. Over and above these, 600 students from the Electrical, Mechanical, Pharmacy, and Science streams of
