The slow escalator ride to daftness
Decline. The shadow of death. It's coming and for most of us it's going to be an agonisingly slow, shopping centre escalator ride to below ground level.
Not a boy, not yet a man (only psychologically), even I find myself at a loss to remember things. Who's that daft bloke staring back at me in the mirror? What day is it? Where did I leave my keys? When did I last cut my toenails? Why does it always rain on me? Do you believe in life after love? These are all relatable questions we ask ourselves every day, and the fact that the questions themselves and their answers are only going to be harder to come by is far from reassuring.
As we age, we experience a general decline in our brain functioning, what we term cognitive decline. It most profoundly affects our learning, memory and movement. You've seen your daft, old granny (see previous article for further details). This cognitive decline usually begins during mid-life crises, the kind of ironic adding of insult to injury that I can only assume the universe revels in. By our mid-60s, the rate of decline steepens, along with other age-related concerns. In the fight to stay cognizant, we can mitigate the effects of decline by reducing scientifically established risk factors: for example, a better diet, more physical activity, brain training and meeting up with old friends. Equally, the reverse of these will increase our risk, along with smoking and poor sleep. This is all tried and tested medical advice, but what the doctors often fail to mention is that a certain amount of your fate is determined by the inherited genetic component of cognitive decline. Recently, I was tasked with exploring this further by uncovering the genes associated with decline. Through this process I became impervious to any form of memory loss - unless it involves remembering what happened in that series I just finished on Netflix.
Using a cohort of twins, I analysed the expression of genes in their blood, fat, immune and skin cells. They had also been tested using a battery of cognitive tests, once in 1999 and again in 2009. Mathematical wizardry elsewhere in the department managed to represent each twin's cognitive decline between these dates as an age-related change (ARC) score. As you'd expect, most of their scores showed a level of depreciation. The score from one of these tests, the paired associates learning (PAL) test, is commonly used as a marker for Alzheimer's disease, so I studied the change in this score specifically. Both the overall ARC score and the PAL change correlated with age, which suggested they were good measures of decline.
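For the curious, that age check boils down to a simple correlation. Here's a minimal sketch with made-up numbers (not the study's data - the ages, slope and noise are all invented for illustration), showing how you'd confirm that a cognitive score worsens with age:

```python
# Toy sanity check: does a cognitive score track age?
# All numbers here are invented; the real study used ARC scores from twins.
import numpy as np

rng = np.random.default_rng(0)
age = rng.uniform(55, 80, 100)                  # 100 hypothetical participants
score = -0.05 * age + rng.normal(0, 0.5, 100)   # score drifts down with age, plus noise

r = np.corrcoef(age, score)[0, 1]               # Pearson correlation
print(f"correlation with age: {r:.2f}")         # clearly negative: older = lower score
```

A clearly negative correlation like this is what suggested the ARC and PAL scores were doing their job as measures of decline.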
Gene expression in skin was most strongly linked to cognitive decline for both measures. This is most likely because the skin acts as the interface between the environment and the organism, and therefore suffers the most extreme effects of environmentally induced ageing. This is particularly relevant if you consider the bracing fluctuations in weather we Brits have to endure on an almost hourly basis (see summer for more details). Certain genes came out as statistically significant, and when we ran an enrichment analysis on them - that is, grouped them by the biological processes they belong to and tested which processes turned up more often than chance would predict - we found a number of processes that might be responsible for this age-related change in gene expression. From a list of around 20 genes, the analysis returned 28 possible processes. Nine of these concerned tumour necrosis factor (TNF) signalling, which isn't quite as scary as it sounds because it's integral to immune system function. A gene called TRAF2 seemed to be driving these results, showing lower expression in people who experienced greater cognitive decline. TRAF2 inhibits TNF signalling, so when you reduce its expression, you get more TNF signalling. TNF signalling induces inflammation and cell death, so more of it could lead to increased brain cell death. TNF signalling has already been shown to increase with age in other studies, and has been linked to cognitive disorders, Alzheimer's disease and depression, as well as other age-related diseases such as cancer and heart disease.
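If you want to see what an enrichment test looks like under the bonnet, here's a minimal sketch of an over-representation test of the kind used in this sort of analysis. The counts are all invented (the pathway size, background size and hit numbers are assumptions, not the study's figures); the idea is to ask, via Fisher's exact test, whether a pathway such as TNF signalling contains more of our genes of interest than you'd expect by luck:

```python
# Toy over-representation test for one pathway.
# All counts are hypothetical; the real study returned 28 processes from ~20 genes.
from scipy.stats import fisher_exact

background = 20000   # assumed: genes in the background set
pathway = 200        # assumed: genes annotated to the pathway
gene_list = 20       # genes of interest (roughly the study's list size)
hits = 5             # assumed: genes of interest that fall in the pathway

# 2x2 table: in/out of pathway vs in/out of gene list
table = [
    [hits, gene_list - hits],
    [pathway - hits, background - pathway - (gene_list - hits)],
]
odds_ratio, p_value = fisher_exact(table, alternative="greater")
print(f"enriched: {p_value < 0.05} (p = {p_value:.2g})")
```

With these numbers the expected overlap by chance is only 20 × 200 / 20000 = 0.2 genes, so seeing 5 is wildly unlikely and the pathway counts as enriched.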
Now before you start panicking, it's important to remember that at this stage this is only a correlation. The study mainly used elderly women, the sample size was small, and gene expression wasn't measured in brain tissue. Although, do you realise how hard it is to get brain tissue? "Hi, Ms. Participant-Lady, I'm a scientist, would you mind voluntarily letting me stick a needle through your skull a few times over the next couple of years for science?" We have to use other tissues as an approximation, limiting as they may be. In the future, we will need more participants to try to reproduce the results in other datasets. We would perhaps use more invasive but accurate representations of cognitive decline, and maybe pick up our old friend machine learning to see if any of this is actually predictive of neurological disease. If we can do these things, the hope is that we might be able to slow the rate of decline and the incidence of disease in an ageing and increasingly daft population. Until then, I suggest downloading a reasonably priced brain training app to peruse - that is, when you're not exercising with friends and eating nutritious meals. HAHA, I know right!