What causes the body to age?

The Greek philosopher Aristotle thought it was the heart—a “hot, dry” organ that served as the seat of intelligence, motion, and sensation.

Fast-forward a couple of millennia, and the brain has overthrown the heart as master of thought. But its control over bodily aging—if any—was unclear. Because each organ has its own pool of stem cells to replenish aged tissue, scientists have long thought that the body has multiple “aging clocks” running concurrently.

As it turns out, that’s not quite right.

This week, a study published in Nature threw a wrench into the classical theory of aging. In a technical tour-de-force, a team led by Dr. Dongsheng Cai from the Albert Einstein College of Medicine traced a critical source of aging to a small group of stem cells within the hypothalamus—an “ancient” brain region that controls bodily functions such as temperature and appetite.

Like fountains of youth, these stem cells release tiny fatty bubbles filled with mixtures of small biological molecules called microRNAs. With age, these cells die out, and the animal’s muscle, skin and brain function declines.

However, when the team transplanted these stem cells from young animals into middle-aged mice, the transplants slowed aging. The recipient mice were smarter, more sociable and had better muscle function. And—get this—they also lived 10 to 15 percent longer than mice transplanted with other cell types.

To Dr. David Sinclair, an aging expert at Harvard Medical School, the findings represent a “breakthrough” in aging research.

“The brain controls aging,” he says. “I can see a day when we are implanted with stem cells or treated with stem cell RNAs that improve our health and extend our lives.”

Hypothalamus: The Ancient Brain

It’s incredible to think that a tiny group of cells in one brain region could be the key to aging.

But to Cai, there are plenty of examples throughout evolution that support the theory. Experimentally changing a few of the 302 neurons in the nematode worm C. elegans is often sufficient for changing its lifespan, he says.

Of course, a mammalian brain is much more complicated than a simple worm. To narrow the problem down, Cai decided to zero in on the hypothalamus.

“The hypothalamus has a classical function to regulate the whole body’s physiology,” he says, “so there’s a natural logic for us to reason that the hypothalamus might be involved in aging, which was never studied before.”

Even so, it was a high-risk bet. The hippocampus—because of its importance in maintaining memory with age—is the most popular research target. And while the hypothalamus had previously been loosely linked to aging, no one knew how.

Cai’s bet paid off. In a groundbreaking paper published in 2013, he found that a molecule called NF-kappaB increased in the hypothalamus as an animal grew older. Zap out NF-kappaB activity in mice, and they show far fewer age-related symptoms as they grow older.

But here’s the kicker: the effects weren’t limited to brain function. The animals also better preserved their muscle strength, skin thickness, bone and tendon integrity. In other words, by changing molecules in a single part of the brain, the team slowed down signs of aging in the peripheral body.

Stranger Cells

But Cai felt he had only solved part of the aging puzzle.

At the cellular level, a cornucopia of factors controls aging. There is no single key to aging, no one molecule or pathway that dominates the process. Inflammation, which NF-kappaB regulates, is a big contributor. So are the length of telomeres, the protective end caps of DNA, and, of course, stem cells.

Compared to other tissues in the body, stem cells in the brain are extremely rare. So imagine Cai’s excitement when, just a few years ago, he learned that the hypothalamus contains these nuggets of youth.

Now we can put the two threads together, and ask whether stem cells in the hypothalamus somehow regulate aging, he says.

In the first series of experiments, his team found that these stem cells, which line a V-shaped region of the hypothalamus, disappear as an animal ages.

To see whether declining stem cell function contributes to aging, rather than simply being a consequence of old age, the researchers used two different types of toxins to wipe out 70 percent of the stem cells while keeping mature neurons intact.

The results were striking. Over a period of four months, these mice aged much faster: their muscle endurance, coordination, and treadmill performance tanked. Mentally, they had trouble navigating a water maze and showed less interest in socializing with other mice.

“All of these physiological changes reflected an acceleration in aging,” Cai and team concluded in their article.

And the consequences were dire: the animals died months earlier than similar transgenic animals without the toxin treatment.

Spring Back

If the decline in stem cell function is to blame for aging, then resupplying the aged brain with a fresh source of stem cells should be able to reinvigorate the animal.

To test this idea, the team isolated stem cells from the hypothalamus of newborn mice, and tinkered with their genes so that they were more resilient to inflammation.

The aged hypothalamus is more inflamed, and that inflammation harms stem cells, so this step was necessary, the authors explained.

Four months after the cells were transplanted into middle-aged mice, the recipients showed better cognitive and muscular function. What’s more, they lived, on average, 10 percent longer than mice transplanted with other cell types. For a human, that means extending an 85-year life expectancy to roughly 93. Not too shabby.
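For readers who want to check that back-of-the-envelope figure, here is a purely illustrative sketch; the 85-year baseline is the article’s own example, not data from the study.

```python
# Illustrative arithmetic only: a 10 percent extension of an 85-year life expectancy.
baseline_years = 85
extension = 0.10
print(baseline_years * (1 + extension))  # 93.5, i.e., roughly 93 years
```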

But the best was yet to come. How can a few cells have such a remarkable effect on aging? In a series of follow-up experiments, the team found that the pool of biological molecules called microRNAs was to thank.


microRNAs are tiny molecules with gigantic influence. They come in various flavors, bearing rather unimaginative names like “106a-5p,” “20a-5p” and so on. But because they can act on multiple genes at the same time, they pack a big punch. A single type of microRNA can change the way a cell works—whether it activates certain signaling pathways or makes certain proteins, for example.

While most cells make microRNAs, Cai found that the hypothalamus stem cells have a “unique, very strong” ability to pack these molecules up into blobs of membrane and shoot them out like a bubble gun.

Once outside the cell, the microRNAs go on a fantastic voyage across the brain and body, where they tweak the biology of other tissues.

In fact, when the team injected purified little bubbles of microRNAs into middle-aged mice, they also saw broad rejuvenating effects.

Cai explains that they don’t yet know whether the microRNAs are pumped out to directly affect the rest of the body, or whether they first act on different areas of the brain, and the brain then goes on to regulate aging in the body.

Forever Young

Even so, the aging field is intrigued.

According to Dr. Leonard Guarente, an aging biologist at MIT, the study could lead to new ways to develop anti-aging therapies.

What’s more, it’s possible the intervention could stack with other known rejuvenating methods, such as metformin, young blood or molecules that clean out malfunctioning cells.

It’s possible that stem-cell therapy could boost the hypothalamus’ ability to regulate aging. However, scientists still need to know how stem cells link with the hypothalamus’ other main role, that is, releasing hormones.

Of course, injecting cells into the brain isn’t a practical treatment. The team is now working hard to identify which of the thousands of types of microRNAs control aging and what exactly they do.

Then the goal is to validate those candidate anti-aging microRNAs in primates, and eventually, humans.

“Of course humans are more complex. However, if the mechanism is fundamental, you might expect to see effects when an intervention is based on it,” says Cai.

Shelly Xuelai Fan is a neuroscientist at the University of California, San Francisco, where she studies ways to make old brains young again. In addition to research, she’s also an avid science writer with an insatiable obsession with biotech, AI and all things neuro. She spends her spare time kayaking, bike camping and getting lost in the woods.

insideBIGDATA Guide to Deep Learning and Artificial Intelligence

The insideBIGDATA Guide to Deep Learning & Artificial Intelligence is a useful new resource directed toward enterprise thought leaders who wish to gain strategic insights into this exciting area of technology. In this guide, we take a high-level view of AI and deep learning in terms of how they’re being used and what technological advances have made them possible. We also explain the difference between AI, machine learning and deep learning, and examine the intersection of AI and HPC. We then present the results of a recent insideBIGDATA survey to explore how well these new technologies are being received. Finally, we take a look at a number of high-profile use case examples showing the effective use of AI in a variety of problem domains.

Deep Learning and AI – An Overview
This is the epoch of artificial intelligence (AI), when the technology is coming into its own for the mainstream enterprise. AI-based tools are pouring into the marketplace, and many well-known names have committed to adding AI solutions to their product mix—General Electric is pushing its AI business called Predix, IBM runs ads featuring its Watson technology talking with Bob Dylan, and CRM giant Salesforce released an AI addition to its products, a system called Einstein that provides insights into which sales leads to follow and which products to make next.

These moves represent years of collective development effort and billions of dollars in terms of investment. There are big pushes for AI in manufacturing, transportation, consumer finance, precision agriculture, healthcare & medicine, and many other industries including the public sector.

AI is becoming important as an enabling technology, and as a result the U.S. federal government recently issued a policy report, “Preparing for the Future of Artificial Intelligence,” from its Subcommittee on Machine Learning and Artificial Intelligence, to provide technical and policy advice on topics related to AI.

Perhaps the biggest question surrounding this new-found momentum is “Why now?” The answer centers on both the opportunity that AI represents and the fear many companies have of missing out on its potential benefits. Two key drivers of AI progress today are (i) scale of data and (ii) scale of computation. Only recently have technologists figured out how to scale computation to build deep learning algorithms that can take effective advantage of voluminous amounts of data.
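As a rough, hypothetical illustration of the “scale of data” driver (not an example from the guide), the sketch below trains the same simple scikit-learn classifier on progressively larger slices of a synthetic dataset; held-out accuracy generally climbs as the training set grows.

```python
# Toy illustration: the same model, trained on more data, usually generalizes better.
# Hypothetical example using scikit-learn; not part of the original guide.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Synthetic binary-classification data.
X, y = make_classification(n_samples=20000, n_features=40, n_informative=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

for n in [100, 1000, 10000]:  # increasing amounts of training data
    model = LogisticRegression(max_iter=1000)
    model.fit(X_train[:n], y_train[:n])
    acc = accuracy_score(y_test, model.predict(X_test))
    print(f"trained on {n:>6} examples -> test accuracy {acc:.3f}")
```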

One of the big reasons AI is on its upward trajectory is the rise of relatively inexpensive compute resources. Machine learning techniques like artificial neural networks were widely used in the 1980s and early 1990s, but their popularity diminished in the late 1990s, in large part because neural networks are computationally expensive algorithms. More recently, neural networks have had a major resurgence: computers have become fast enough to run large-scale networks, and since 2006 advanced neural networks have been used to realize methods referred to as deep learning. With the adoption of GPUs (graphics processing units originally designed for gaming), neural network developers can now run deep learning with the compute power required to bring AI to life quickly. Cloud and GPUs are converging as well, with AWS, Azure and Google now offering GPU access in the cloud.
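To make the GPU point concrete, here is a minimal sketch, assuming PyTorch is installed, of how a developer checks for a GPU and runs the same dense math on whichever device is available; the automatic CPU fallback is part of what makes cloud GPU instances easy to adopt.

```python
# Minimal sketch: run the same computation on a GPU if one is available, else on the CPU.
# Assumes PyTorch; not tied to any specific cloud provider.
import torch

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
print(f"Using device: {device}")

# A large matrix multiplication, the kind of dense math GPUs accelerate.
a = torch.randn(4096, 4096, device=device)
b = torch.randn(4096, 4096, device=device)
c = a @ b
print(c.shape)
```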

There are many flavors of AI: neural networks, long short-term memory networks (LSTMs), Bayesian belief networks, and so on. Neural network workloads are currently split between two distinct phases: training and inference. Training commonly takes much more compute performance and uses more power, while inference (formerly known as scoring) is far lighter. Generally speaking, leading-edge training compute is dominated by NVIDIA GPUs, whereas legacy training compute (before the use of GPUs) was handled by traditional CPUs. Inference compute is divided across Intel CPUs, Xilinx/Altera FPGAs, NVIDIA GPUs, ASICs like Google’s TPU and even DSPs.
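The training/inference split can be sketched in a few lines. The hypothetical PyTorch snippet below is illustrative only: the training step computes gradients and updates weights (the compute-heavy part), while the inference step is a single forward pass with gradients disabled.

```python
# Sketch of the two workloads: training (backpropagation, optimizer updates)
# versus inference (a single forward pass with gradients disabled).
# Hypothetical PyTorch example for illustration only.
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 2))
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()

# --- Training: compute loss, backpropagate, update weights (compute-heavy) ---
x = torch.randn(256, 32)            # a batch of training inputs
y = torch.randint(0, 2, (256,))     # labels
model.train()
optimizer.zero_grad()
loss = loss_fn(model(x), y)
loss.backward()
optimizer.step()

# --- Inference: forward pass only, no gradients (much lighter) ---
model.eval()
with torch.no_grad():
    prediction = model(torch.randn(1, 32)).argmax(dim=1)
print(prediction)
```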
