Dementia encompasses a range of diseases that cause progressive changes to the brain and mostly affect elderly populations. Unlike other types of dementia, most notably Alzheimer’s disease, frontotemporal dementia predominantly affects speech and behaviour and has an earlier onset than other varieties. The disease is labelled frontotemporal because it is characterised by damage to both the frontal and temporal lobes of the brain, located at the front and sides. Social interaction and personality are key functions of the frontal lobe, and these are drastically impacted in patients with frontotemporal dementia.
The main cause of frontotemporal dementia is atrophy (shrinking) of these regions of the brain, resulting in changes to the behaviour, language and social interactions they govern. Although it is a rare form of dementia, frontotemporal dementia is often highly noticeable to friends and family of the patient, and this results in early diagnoses, typically between the ages of 45 and 65, whereas the majority of dementia cases occur in people over 65. Although some genetic mutations have been linked to frontotemporal dementia, roughly 50% of patients have no family history of the disease. Emerging research has also suggested connections between frontotemporal dementia specifically and amyotrophic lateral sclerosis (ALS).
Frontotemporal dementia is most commonly noticed by loved ones first, as the patient’s social manners and behaviour gradually become more erratic or inappropriate for their setting. This can be difficult for healthcare professionals to detect without knowing the patient well, meaning observation by close family and friends is vital to an accurate diagnosis. Symptoms often worsen over a few years: apathy towards previous interests, lack of empathy and changes in behaviour can be obvious signs. Less commonly, frontotemporal dementia can also cause symptoms more typical of other forms of the disease, such as cognitive problems and reduced mobility. The behavioural symptoms of frontotemporal dementia can occasionally be misinterpreted as aggression, discourtesy or even obsessive behaviour, and patients may develop new routines, lose interest in self-care and begin to disregard the feelings of others.
Language can also be severely impacted by this form of dementia: patients may lose vocabulary, mistake objects, struggle to form sentences and even copy what someone else says in conversation. Sufferers also often lack the ability to organise themselves or see other perspectives, becoming distracted and agitated very quickly and unable to stick to plans. Alongside these mental changes, those with frontotemporal dementia may also become incontinent and struggle with mobility. Although memory and physical impairments can occur, they usually appear once the disease is more advanced; behavioural differences are observed first.
It can be challenging to obtain a diagnosis of frontotemporal dementia, as the patient may be unwilling to cooperate or may even become aggressive and rude when prompted to seek medical advice. If sufferers suspect changes to their behaviour or physical health, they should consult a GP; a loved one could urge them to make an appointment and join them for support. Doctors can assess the symptoms, and further investigations at a hospital may be required to check whether any noticeable abnormalities are present in the frontal or temporal lobes. Other comorbidities can also be ruled out, which can offer both the patient and their family peace of mind.
Receiving a diagnosis of frontotemporal dementia can be challenging for both the patient and their family. It is common for some to reject the diagnosis or even be offended by it; however, it is important that the right support is provided so that quality of life is preserved and families are able to manage the symptoms. The multidisciplinary team is likely to advise that a care plan is established, including whether nursing support is required at home, to support carers of those with frontotemporal dementia. Antidepressants are occasionally used to manage symptoms, and occupational therapy, speech and language therapy, and dementia-friendly groups and activities can help patients remain as comfortable and independent as possible as the disease progresses. Although rarely needed, antipsychotics can be prescribed to lessen violence or aggression that may risk carers’ safety, and developing coping mechanisms for challenging behaviour is advisable. Financial support may also be offered to help adapt the family home so that it is as suitable and accessible as possible for the sufferer.
The human skull (commonly referred to as the cranium) is a protective bone structure surrounding the brain. At first glance, the skull appears to be simply a protective structure; however, as we look more closely at the skull in foetuses, babies and older humans, we can begin to understand its changing purpose. In this article, we will discover how this seemingly simple protective bone structure undergoes a great journey of change.
Skull at Birth
At birth, all of the major bones of the skull have formed, but they are distinctly separated by connective tissues called sutures and fontanelles. These tissues serve two purposes at birth. The first is to ensure that during delivery the skull can change shape (thanks to the flexibility provided by the sutures) to pass through the relatively narrow birth canal. If the foetal skull lacked these connective tissues, it would be extremely difficult to deliver a baby the traditional way, and even a successful delivery could cause immediate injury to the skull of the baby and the birth canal of the mother. The second function is to accommodate the rapid growth of the brain during the first months and years of a baby’s life. This ensures that the child does not develop brain damage from the brain growing faster than the skull, which would cause continuous intracranial pressure on the cerebrum.
Looking more closely at the sutures and fontanelles at birth, the diagram below (Fig 1) shows the superior view of the skull. We can distinctly observe how the frontal bones and parietal bones are separated by the anterior and posterior fontanelles respectively. This links back to our two previous functions: ensuring the growth of the brain and easing the passage of the foetus down the birth canal.
Fig 1. Labeled superior view of the skull at birth. Relevant labels link back to functions of fontanelles and sutures between the bones. Image by Stanford Children’s Health.
Skull at Infancy
During the first 18 months, a baby learns basic movements such as twisting the head, holding up its own head, rolling over, and sitting up. However, due to the lack of developed motor skills, these movements are often spontaneous and uncontrolled. This poses an immediate risk of injury, hence the fontanelles and sutures that are in place at birth remain until about 18 months, providing sufficient protection from the minor impacts of learning to move. As a result, the skull does not change significantly during the first 18 months of a child’s life, maintaining the flexibility to facilitate growth and protect against minor injuries.
However, by the age of 2, the anterior fontanelle begins to close, reducing flexibility between the two frontal bones. The anterior fontanelle is the largest of the connective tissues, and also one of the most clinically significant: it provides information about a baby’s state of health, specifically regarding hydration and intracranial pressure. As a result, by around the age of 2, this clinically and structurally significant structure disappears, removing a source of clinical information and a degree of cranial flexibility.
The Changes to the Mandible
We now shift our attention away from the changing tissue structure of the skull, and toward the changes to the Mandible.
The Mandible, known as the lower jaw, is the largest bone in the lower skull, responsible for supporting the lower teeth and holding them in place. It is an extremely strong bone and, unlike the upper jaw, is capable of movement. The Mandible undergoes a series of changes from birth to old age, which we will explore below.
At birth, the Mandible consists of two halves of bone, which fuse together early in life. The whole body of the Mandible elongates, with a particular focus on growth of the area behind the mental foramen, which provides space for the first few teeth. Moreover, the body of the bone increases in depth to make room for the roots of the teeth. Hence, in childhood we see growth in specific areas of the Mandible, driven mostly by the growth of teeth and their roots at early ages.
At birth, the angle of the Mandible is obtuse, at 140 degrees or more, whereas in adults it is 110–120 degrees due to the presence of a full set of teeth. This naturally raises the volume of bone, so the chief part of the bone lies above the oblique line. In old age, the Mandible returns to an obtuse angle of about 140 degrees due to the loss of teeth, and the chief part of the bone lies below the oblique line. The angle of the Mandible is very significant in terms of skeletal maturity: its changes are often used by dentists to draw conclusions about dental health and the overall development of the Mandible.
Finally, the Mandible goes through great changes in height and length (as seen earlier in the changes caused by the initial growth of teeth). Here we shall focus on the changes of old age. As people get older they begin to lose their teeth, the primary culprits being receding gums and deterioration of the jawbone. As teeth are lost in old age, the alveolar process, the ridge of bone that contains the tooth sockets (Fig 3), becomes resorbed, and the Mandible reduces in height. Reduction in Mandible height has various implications, the most notable being reduced jaw mobility and a distinct change in facial appearance.
In this article, we have discovered that the human skull is not just a large bone surrounding the brain, but an intricate, dynamic structure with numerous adaptations and changes. From fontanelles and sutures to the versatile Mandible and alveolar process, the skull is packed with different functions and can be considered among the most important bony structures in our body, responsible for our safety, speech, and facial appearance through the ages.
The organ crisis is an ever-prominent issue as globally, the demand for healthy and well-functioning organs significantly exceeds the available supply. Xenotransplantation is a medical procedure whereby living cells, tissues or organs are transplanted from one species to another: a potential solution for the human organ crisis.
In the United States of America alone, ten patients die each day while on the waiting list for life-saving vital organ transplants.
Xenotransplantation could alleviate this issue, while also providing greater access to transplant organs for ethnic minorities and limiting rejection in grafts. The benefits and risks of this procedure will be evaluated to determine whether xenotransplantation is a viable solution to human organ shortages.
Potential for Xenotransplantation
Today, organ transplantation is mainly undertaken to treat severe failure of vital organs such as the heart, lungs, liver and kidneys. Population aging and increasingly sedentary lifestyles will result in an increase in the prevalence of chronic diseases (such as Type 2 Diabetes Mellitus and Coronary Heart Disease) and, consequently, in the demand for transplantation. The strain on transplant organ supplies will encourage biomedical breakthroughs, particularly in diagnostics and genomics, to reduce the occurrence of diseases progressing to life-threatening stages where only organ transplantation is viable. Currently, however, xenotransplantation is one of the most readily available options to lessen the organ transplant crisis.
Xenografts: Reducing Risk of Rejection
Xenotransplantation enables not only organ transplantation but also the transplantation of tissues and cells (known as xenografts). Various animal tissues, such as bone, skin, and foetal neural tissue, have been suggested as viable for xenotransplantation. Animal-to-human transplantation of tissues is certainly less dangerous than organ xenotransplantation: the animal tissues do not have major blood supplies, so the immune response is less vigorous, and the surgical procedure carries less risk. This means less damage to the patient is expected, due to the lower risk of rejection and infection.
A common example of animal tissue xenotransplantation is biological heart valves. These valves are used since unlike mechanical valves, they are not associated with a higher risk of blood clots, so the intake of blood thinners is not required. Porcine (pig) valves, which are an example of successful xenografts in modern medicine, are frequently used to replace aortic and mitral valves in the heart to treat cardiac valve conditions such as valvular stenosis.
The Benefit to Patients of Ethnic Minorities
In addition, xenotransplantation could be hugely beneficial for patients of ethnic minorities, for whom locating a suitable organ match is difficult. To reduce the risk of rejection, donors and recipients are ‘matched’ – it is vital for the donor and the recipient to have matching blood groups and tissue antigen types, which is generally difficult to coordinate within an ethnic minority group.
In 2017, figures revealed that 21 percent of people in the UK who died on the organ transplant waiting list were from a black, Asian, or ethnic minority background compared with 15 percent a decade ago. This problem is then heightened due to the fact that such ethnic minorities constitute a significantly larger proportion of the organ donation waitlist: patients of ethnic minorities are three to four times more likely to develop end-stage renal failure and therefore require a kidney transplant. The huge stress on the supply of organs for ethnic minority patients is clearly evident here, and xenotransplantation has the potential to curb this issue greatly.
Ethical Concerns Against The Procedure
Animal mistreatment and disregard for animal rights are among the most significant ethical concerns associated with xenotransplantation, and it is argued that exploiting animals for their organs is a violation of nature.
The welfare of animals is of great concern, as these animals will be subject to trials and experiments before xenotransplantation becomes an available medical procedure. In addition, since there is the possibility of virus transmission from animal to human, these animals are raised on special diets and in supervised environments designed for their growth. This has attracted ethical criticism: keeping animals confined in an area dissimilar to their natural habitat, on synthetic diets and without interaction with other animals, does not allow them to live naturally.
Xenotransplantation benefits humankind by potentially saving lives and extending life expectancies, to the detriment of animals.
From a religious perspective, a fundamental value is to protect and preserve human life.
Xenotransplantation could be difficult in some religious cultures, such as Judaism and Islam, where the consumption of swine is prohibited; to receive a transplanted organ from this animal could be considered sinful and contrary to their religious ideologies. Hence, another appropriate animal species might need to be found as a source of organs for transplantation into humans. However, according to some perspectives within Christianity, Judaism, and Islam, there are no specific, binding religious reasons that prohibit xenotransplantation to treat grave and life-threatening organ insufficiencies.
If a patient refuses xenotransplanted organs on religious or cultural grounds and still requires an organ to treat their condition, they may need to be prioritized for allotransplantation (human to human transplants), stimulating another ethical dilemma. In the event that xenotransplantation is a reality, the allocation of human and animal organs would have to be thoroughly evaluated, ensuring that allocation is based on clinical need and that the maximum number of people continue to receive transplant organs.
Scientific Concerns Against The Procedure
Physiological Differences in Organs
Size and longevity are two main issues regarding animal organs for xenotransplantation. For example, a pig heart or kidney, even when of suitable size for donation, may still have the potential for rapid growth, and the rate of growth of animal organs is likely to differ significantly from that of human organs. These differences in organ size will limit the range of potential recipients of xenotransplants. Furthermore, the natural lifespan of a pig is roughly 15 years, and how xenotransplanted organs age is unknown.
Introduction of Unknown Infections
Xenozoonosis is the transmission of infectious agents between species via xenotransplants and xenografts. Animal-to-human transmission of pathogens is rare, but past occurrences include avian influenza.
The main reason for the increased likelihood of disease transmission in xenotransplantation is that implanting foreign tissues into the human body breaches the physical barriers that usually prevent the transmission of disease. This potentially exposes the recipient to a myriad of unknown pathogens, particularly as the recipient is likely to be severely immunosuppressed.
Impact on Psychological Well Being
Transplantation of a transgenic animal organ can cause major psychological and personality issues for the recipient. In 2010, a study revealed that individuals can form troubling perceptions of their physical form and identity as they struggle to accept an animal organ transplant, creating controversy for both patients and society.
Hyperacute Rejection
Hyperacute rejection occurs because human antibodies in the blood recognize the foreign antigens on the cells of the xenograft, antigens found on the cells of pigs and other distantly related species, triggering a rapid immune response. These antibodies bind to the cells lining the blood vessels of the organ as blood flows through the animal tissues, activating the complement system, which attacks the xenograft. White blood cells are also activated by the complement proteins, and within a few minutes the animal tissues are reduced to a black, swollen mass: the xenograft has been rejected.
Although hyperacute rejection can occur in allotransplantation, it is significantly more likely in xenotransplant recipients, as the tissues originate from an individual of a different species and therefore carry notably different antigens on their cells’ surfaces. Hyperacute rejection is probably the most difficult scientific concern surrounding xenotransplantation; however, many promising methods to tackle this issue have arisen. One method involves the genetic modification of pigs (transgenic pigs) so that their cells express human complement regulatory proteins; when an organ from a modified pig is transplanted into a human, these proteins inhibit the activation of complement, preventing hyperacute rejection.
Application to Fundamentals of Medical Ethics
Informed consent is the patient’s permission for a procedure, given after receiving thorough information about its side effects; in return, the patient is protected from treatments that are unwanted or incompatible with their beliefs. Informed consent is immensely significant in medical procedures, particularly in ethically and medically controversial procedures such as xenotransplantation.
With future scientific breakthroughs to reduce the risk of rejection and to alter the rate at which the tissues age, xenotransplantation could become as widely available as allotransplantation, as the risk of further disease to the recipient would be much lower. At present, however, xenotransplantation of organs remains extremely difficult, as more harm than good may be done to the patient, even if the organ is intended to remain in the patient’s body only briefly, to extend the patient’s life expectancy while they wait for a human transplant organ.
Currently, the risks associated with xenotransplantation potentially outweigh the benefits, mainly due to scientific uncertainty over whether the transplanted organ can function as desired in the recipient’s body.
The physical risks of xenotransplantation can be life-threatening; hence, the recipient should be under close follow-up for an indefinite period of time, which can affect many aspects of life, including sexual relationships and nutrition. In addition, recipients may have compromised immune systems and an increased risk of contracting infections, potentially resulting in complete isolation and quarantine from others. This situation is arguably against the most basic human rights: freedom and the establishment of relationships with others.
Several solutions to the medical issues concerning xenotransplantation have arisen in recent years, including genetic modification and the use of transgenic pigs. However, an abundance of research and funding for clinical trials is still required to ensure that the procedure is well understood, for the safety of potential recipients.
Currently, despite the unmet balance between organ demand and supply, xenotransplantation may not be an attractive option in the near future, as ethical concerns around anthropocentric views and animal mistreatment still remain. But if society can accept the ethical obligations associated with xenotransplantation, the procedure undoubtedly has the potential to develop into a legitimate solution to organ shortages.
The use of music for children diagnosed with Autism Spectrum Disorders (ASD) dates back to the 1940s and over time has been adopted by clinicians. This article discusses the condition and its causes and analyses the potential of music therapy to improve the cognitive and social abilities of ASD patients through a neurological lens.
By Ruhana Mahmud
Autism Spectrum Disorder: A brief introduction
Autism Spectrum Disorder (ASD) is a neurodevelopmental condition characterized by impaired social and communication skills and repetitive, restricted motor activities. The term refers to a “spectrum” of disorders including autistic disorder, Asperger’s disorder, childhood disintegrative disorder, and pervasive developmental disorder not otherwise specified (PDD-NOS). Hypo- or hyperreactivity to sensory stimuli and unconventional interests are found in at least 70% of people diagnosed with ASD, which may be linked to defective development of the brainstem or cerebellum in utero. Generally, autism is classified into three levels: (i) requiring support, (ii) requiring substantial support, and (iii) requiring very substantial support. Children with ASD face difficulties with social reciprocation and interaction. The condition affects 1 in 68 children.
What causes autism?
ASD is a neurobiological condition in which brain development is affected by both genetic and environmental factors although no single primary cause has yet been identified.
Siblings of ASD patients are at a much higher risk, and the risk is significantly higher still in monozygotic twins. Genes coding for proteins required at synapses or for activity-dependent neuronal changes, and genes involved in neurotransmission and inflammation, have been linked to autism. Genetically, ASD is one of the most heterogeneous disorders.
The phenotypic expression of the genes linked to autism is very variable. Higher parental age increases the risk. Children of mothers suffering from autoimmune diseases, diabetes or infections while pregnant, or taking thalidomide or valproic acid, are at a higher risk of ASD. In addition, premature delivery, low birth weight and caesarean delivery are also speculated to be related to ASD.
Autism Spectrum Disorder and The Brain
People with ASD have been found to have differences in cerebellar connectivity, abnormal limbic systems, modifications to the frontal and temporal lobes, and reduced long-range and local connectivity. Increased cortical size and extra-axial fluid, alongside defective neuronal differentiation and cortical formation, have been linked to ASD. Structural and functional changes in the sensorimotor networks of the cerebellum and cerebro-cerebellar regions in ASD patients have been linked to their repetitive behavioral patterns.
What is Music Therapy? Let’s learn.
Music therapy, as defined by the American Music Therapy Association, is the “clinical and evidence-based use of music interventions to accomplish individualized goals within a therapeutic relationship by a credentialed professional who has completed an approved music therapy program.” It is the application of musical stimuli to achieve non-musical outcomes: such as improvements in social, communication, cognitive and motor skills.
How Music affects the ASD brain
Music activates a large network of brain regions. Cortical and subcortical networks, including the auditory cortex, supplementary motor area (SMA) and cerebellum, are involved in auditory perception. The right temporal lobe, which contributes to speech, also processes the pitch of music, so music can potentially improve verbal communication. The experience-dependent nature of neuroplasticity can be used to “rewire” the structural changes in long-range and local connectivity in ASD patients, using music alongside appropriate treatment. Music also evokes emotion and arousal. In short, music stimulates multiple cortical regions and develops cortical plasticity and functional connectivity.
Rhythms and the ASD brain
Rhythm is a core structural and organizational feature in music that divides time in a distinct order. Timing is also associated with natural and voluntary movements of the body. Synchronization of sensory and motor systems is important for higher cognitive functions. Primary speech production is also linked to brain rhythms. Entrainment to music requires attention, motor synchronization, and non-verbal coordination; thus stimulating an extensive brain network controlling vision, auditory and vestibular perception, and kinaesthesia. Speech, language and motor improvements have also been linked to rhythmic entrainment to music. Musical rhythm can be used to synchronize the otherwise disrupted central rhythm in ASD through cortical plasticity, thus improving stimulation and coordination.
Music Therapy and Autism Spectrum Disorder: An endocrinological perspective
Current research has shown that the ASD population expresses higher levels of the dopamine DRD3 receptor in peripheral blood lymphocytes in response to molecular stimuli. This may provide a molecular basis for a greater reward dimension to musical experiences in people diagnosed with ASD, although further research is needed to solidify this claim.
In 2020, more than 92,000 Americans died from drug overdoses. This was nearly a 30% increase from 2019, according to a report from the Centers for Disease Control and Prevention. From 2002 to 2017, there was a 22-fold increase in the total number of deaths involving fentanyl and other synthetic opioids and more than a 7-fold increase in the number of deaths involving heroin. Emergency department visits for suspected opioid overdoses rose by 30 percent in the U.S. from July 2016 to September 2017. The opioid crisis was declared a nationwide Public Health Emergency on October 27th, 2017.
Opioid Use Disorder (henceforth OUD) is defined as a problematic pattern of opioid use leading to clinically significant impairment or distress1. Opioid tolerance, dependence, and addiction are all manifestations of brain changes resulting from chronic opioid abuse2.
Symptoms of OUD3:
Using more of the drugs, or using them for longer, than you intended
Being unable to control or cut down use
Spending lots of time finding drugs or recovering from use
Stopping or cutting down important activities
Using while doing something dangerous, i.e. driving
Using despite physical or mental problems
Becoming tolerant – needing more of the drug or needing to take it more often
Having withdrawal – physical symptoms when you try to stop
The three primary brain regions involved in Opioid Use Disorders
The brain has many sections that are interconnected with one another, forming dynamic networks responsible for specific functions such as language, attention, reward, and emotion, among many others. Three regions of the brain are primarily involved in OUD. The basal ganglia control the rewarding, or pleasurable, effects of opioid use and are also responsible for the formation of habitual opioid taking. The extended amygdala is involved in stress and is responsible for the feelings of unease, anxiety, and irritability that typically accompany opioid withdrawal. The prefrontal cortex is involved in executive function (i.e. the ability to organize thoughts and activities, prioritize tasks, manage time, and make decisions), including exerting control over opioid use. These regions of the brain are easily ‘hijacked’ by addictive substances.
The Basal Ganglia
The basal ganglia are a group of structures located deep within the brain that play an important role in keeping body movements smooth and coordinated. They are also involved in learning routine behaviours and forming habits. Two sub-regions of the basal ganglia are particularly important in opioid use disorders:
The nucleus accumbens, which is involved in motivation and the experience of reward
The dorsal striatum, which is involved in forming habits and other routine behaviors9
The Extended Amygdala
The extended amygdala and its sub-regions, located beneath the basal ganglia, control the brain’s reactions to stress, including behavioural responses (i.e. ‘fight or flight’) and negative emotions such as anxiety and irritability. This region also cooperates with the hypothalamus, an area of the brain that controls the activity of multiple hormone-producing glands, such as the pituitary gland at the base of the brain and the adrenal glands at the top of each kidney. These glands control reactions to stress and regulate many other bodily processes10.
The Prefrontal Cortex
The prefrontal cortex is located at the very front of the brain, and is responsible for complex cognitive processes called ‘executive function’. Executive function is the ability to organize thoughts and activities, prioritize tasks, manage time, make decisions, and regulate one’s actions, emotions, and impulses11,12.
Figure 3: The three stages of the addiction cycle and the brain regions associated with them.
Addiction can be described as a repeating cycle with three stages. This cycle involves three key drivers that motivate the neurobiological changes associated with opioid dependence:
Binge/Intoxication: the stage at which an individual consumes an intoxicating substance and experiences its rewarding or pleasurable effects. Positive emotions such as euphoria are produced; these are positively reinforcing and therefore increase the probability of using the drug in the early stage of addiction.
Withdrawal/Negative Affect: the stage at which an individual experiences a negative emotional state in the absence of the opioid. It is associated with negative reinforcement because of the desire to consume a drug in order to improve the affective state and to offset the withdrawal symptoms.
Preoccupation/Anticipation/Craving: the stage at which one seeks opioids again after a period of abstinence.
The three stages are linked and interconnected; however, they also involve different brain regions, circuits, and neurotransmitters, resulting in specific kinds of changes in the brain. A person may go through this three-stage cycle over the course of weeks or months, or progress through it numerous times in a day. People vary in how they progress through the cycle and in the intensity with which they experience each stage. Nonetheless, the addiction cycle tends to intensify over time, leading to greater physical and psychological harm12.
The four behaviours central to the Cycle of Addiction:
Impulsivity. An inability to resist urges, deficits in delaying gratification, and unthoughtful decision-making. It is a tendency to act without foresight or regard for consequences and to prioritize immediate rewards over long-term goals13.
Positive reinforcement. The process by which presentation of a stimulus such as a drug increases the probability of a response like drug taking.
Negative reinforcement. The process by which removal of a stimulus such as negative feelings or emotions increases the probability of a response like drug taking.
Compulsivity. Repetitive behaviours in the face of adverse consequences, and repetitive behaviours that are inappropriate to a particular situation. People suffering from compulsions often recognize that the behaviours are harmful, but they nonetheless feel emotionally compelled to perform them; doing so reduces tension, stress, or anxiety13.
Stage 1: Binge/Intoxication: Basal Ganglia
The binge/intoxication stage of the addiction cycle is the stage at which an individual consumes the opioid substance. This stage heavily involves the basal ganglia and its two key sub-regions: the nucleus accumbens and the dorsal striatum (see above for their functions). In this stage, substances affect the brain in several ways.
The rewarding effects produced by addictive substances positively reinforce their use and increase the probability of repeated use. These rewarding effects involve activity in the nucleus accumbens, including activation of the brain’s dopamine and opioid signalling systems. Many studies have shown that the neurons that release dopamine are activated by all addictive substances, but particularly by stimulants such as amphetamines, nicotine and cocaine (Figure 4)16. Furthermore, the brain’s opioid system, which includes naturally occurring opioid molecules (such as endorphins, enkephalins, and dynorphins) and three types of opioid receptors (mu, delta, and kappa), plays a key role in mediating the rewarding effects of other addictive substances, including opioids and alcohol. Activation of the opioid system by these substances stimulates the nucleus accumbens through the dopamine system. Brain imaging studies in humans show activation of dopamine and opioid neurotransmitters during alcohol and other substance use15,16. Studies show that antagonists (chemical substances that bind to and block the activation of certain receptors on cells, preventing a biological response) or inhibitors of dopamine and opioid receptors can block drug and alcohol seeking in both animals and humans.
Stimuli associated with addictive opioids can trigger opioid use; frequent use of a substance teaches the brain to associate the rewarding high with other cues in the person’s life, i.e. the places where they use substances, the friends they take them with, and the paraphernalia that accompany substance-taking. As these prompts become associated with the substance, the person may find it increasingly difficult not to think about using. Over time, these stimuli can activate the dopamine system on their own and trigger powerful urges to take the substance. These “wanting” urges are called incentive salience, and they can persist even after the rewarding effects of the substance have diminished (see fig.5).
The withdrawal/negative affect stage of addiction follows the binge/intoxication stage, eventually setting up for future rounds of binge/intoxication. During this stage, a person who has been using alcohol or drugs experiences withdrawal symptoms, the severity of which depends on:
Duration of drug use
How long the drug stays in your system
How healthy you are
Whether you’re quitting ‘cold turkey’ or using other drugs to help you stop taking opioids20
The negative feelings accompanying withdrawal are thought to come from two sources: reduced activation in the reward circuitry of the basal ganglia and activation of the brain’s stress systems in the extended amygdala.
During long-term use, all substances cause dysfunction in the brain’s dopamine reward system17. For example, brain imaging studies in humans with addiction have consistently shown long-lasting decreases in a particular type of dopamine receptor, the D2 receptor, compared with non-addicted individuals (see fig.6)18,19. Decreases in the activity of the dopamine system have been observed during withdrawal from various substances, including stimulants, opioids, nicotine, and alcohol. Interestingly, other studies also show that when an addicted person is given a stimulant, it causes a smaller release of dopamine than when the same dose is given to a person who is not addicted.
These findings suggest that people addicted to substances experience a general reduction in the sensitivity of the brain’s reward system, both to addictive substances and to natural sources of pleasure such as food, because these also depend on the same reward system and circuits. This explains why those who develop an opioid/substance use disorder often do not derive the same level of satisfaction or pleasure from once-pleasurable activities. This general loss of reward sensitivity may also account for the compulsive escalation of substance use, as addicted individuals try to regain the pleasurable feelings the reward system once provided6. At the same time, a second process occurs during the withdrawal stage: the activation of stress neurotransmitters in the extended amygdala.
Studies suggest these neurotransmitters play a crucial role in the negative feelings associated with withdrawal and in stress-triggered substance use. In both animal and human studies, using antagonists to block activation of the stress neurotransmitter systems decreased opioid intake in response to withdrawal and stress. For example, blocking the activation of stress receptors in the brain reduced alcohol consumption in both alcohol-dependent rats and humans with an alcohol use disorder7. Thus, it may be concluded that an additional motivation for drug and alcohol seeking among individuals with OUDs is to suppress overactive brain stress systems that produce negative emotions or feelings. Research also suggests that neuroadaptations in the endogenous cannabinoid system within the extended amygdala contribute to increased stress reactivity and negative emotional states in addiction21.
The desire to eliminate the negative feelings that accompany withdrawal can be a strong motivation to continue substance use. As previously mentioned, this motivation is strengthened through negative reinforcement, because taking the substance relieves the negative feelings associated with withdrawal, at least temporarily. Of course, this process is a vicious cycle: taking drugs or alcohol to diminish the symptoms of withdrawal that occur during a period of abstinence in fact makes those symptoms even worse the next time a person stops taking the substance, making it even more difficult to maintain abstinence.
The preoccupation/anticipation stage of the addiction cycle is when a person may begin to seek opioids again after a period of abstinence. Those with severe OUD may have a very short (hours-long) period of abstinence. In this stage, an addicted person becomes preoccupied with using substances again, commonly called craving. This stage of addiction involves the brain’s prefrontal cortex, the region that controls executive function, which is essential for making appropriate choices about whether or not to use a substance and for ignoring strong urges to use, especially when the person experiences triggers, such as stimuli associated with that substance or stressful experiences.
Go/Stop system analogy:
To explain how the prefrontal cortex is involved in addiction, we can divide the functions of the brain region into a ‘Go system’ and an opposing ‘Stop system’22. The Go system aids decision-making, particularly decisions that require attention and planning. People also engage the Go system when they begin behaviours that help them attain goals. Research shows that when substance-seeking behaviour is triggered by substance-associated environmental cues (incentive salience), activity in the Go circuits of the prefrontal cortex increases considerably. This increased activity stimulates the nucleus accumbens to release glutamate, the main excitatory neurotransmitter in the brain. This release encourages incentive salience, which creates a powerful urge to use the substance in the presence of drug-associated cues.
Another role of the Go system is to engage with habit-response systems in the dorsal striatum, contributing to the impulsivity associated with substance seeking. Habitual responding can happen automatically and subconsciously, meaning a person may not even be aware that they are participating in such behaviours. The neurons in the Go circuits of the prefrontal cortex rouse the habit systems of the dorsal striatum through connections that use glutamate (see fig.7).
Abbreviations; PFC – prefrontal cortex, DS – dorsal striatum, NAc – nucleus accumbens, BNST – bed nucleus of the stria terminalis, CeA – central nucleus of the amygdala, VTA – ventral tegmental area.
The Stop system inhibits the activity of the Go system. This system controls the dorsal striatum and the nucleus accumbens, the areas of the basal ganglia that are involved in the binge/intoxication stage of addiction. Specifically, the Stop system controls habit responses driven by the dorsal striatum, and evidence suggests it plays a role in reducing the ability of substance-associated cues to trigger use; in other words, it inhibits incentive salience23.
Additionally, the Stop system controls the brain’s stress and emotional systems, and plays an important role in relapse triggered by stressful life events or circumstances. Stress-induced relapse is driven by activation of neurotransmitters in the extended amygdala. As described above, these neurotransmitters are activated during prolonged abstinence in the withdrawal/negative affect stage of addiction. More recent work in animals also implicates disruptions in the brain’s cannabinoid system, which also regulates the stress systems in the extended amygdala, in relapse. Studies show that lower activity in the Stop component of the prefrontal cortex is associated with increased activity of stress circuitry involving the extended amygdala, and this increased activity drives substance-taking behaviour and relapse24.
Brain imaging studies in people with addiction show disruptions in the function of both the Go and Stop circuits24-26. For example, people with alcohol, cocaine, or opioid use disorders display deficits in executive function, including impaired decision-making and behavioural inhibition. These executive function deficits mirror changes in the prefrontal cortex and imply reduced activity in the Stop system and greater responsiveness of the Go system in response to substance-related stimuli12.
Different classes of drug affect the brain & behaviour in different ways
The three stages of addiction apply to all addictive substances; however, different substances affect the brain and behaviour in different ways during each stage of the addiction cycle. Variations in the pharmacokinetics of different substances determine the duration of their effects on the body and partially account for the differences in their patterns of use. For example, nicotine has a short half-life, which means smokers need to smoke often to maintain the effect. In contrast, THC, the primary psychoactive compound in marijuana, has a much longer half-life. Thus, marijuana smokers do not typically smoke as frequently as tobacco smokers27.
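As a rough illustration of how half-life shapes patterns of use, first-order drug elimination can be modelled as exponential decay. The half-life figures below are illustrative assumptions for the sake of the sketch, not clinical values:

```python
from math import exp, log

def fraction_remaining(hours_elapsed: float, half_life_hours: float) -> float:
    """Fraction of a dose still in the body under first-order elimination."""
    return exp(-log(2) * hours_elapsed / half_life_hours)

# Assumed half-lives for illustration only: nicotine ~2 h,
# THC's slow elimination phase ~20 h.
nicotine_left = fraction_remaining(6, 2.0)   # three half-lives -> 1/8
thc_left = fraction_remaining(6, 20.0)       # well under one half-life

print(f"nicotine remaining after 6 h: {nicotine_left:.1%}")  # 12.5%
print(f"THC remaining after 6 h:      {thc_left:.1%}")
```

With the nicotine level dropping to an eighth of the dose within six hours while most of the THC remains, the model captures why a short half-life pushes towards frequent re-dosing.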
Specifically, opioids attach to opioid receptors in the brain, which leads to a release of dopamine in the nucleus accumbens, causing euphoria (the high), drowsiness, and slowed breathing, as well as reduced pain signalling; this is why they are often prescribed as pain relievers. Opioid addiction typically involves a pattern of:
The development of tolerance
Escalation in use
Withdrawal signs and symptoms
These symptoms include intense negative emotions and physical symptoms. As use progresses, the opioid must be taken to avoid the severe negative effects that occur during withdrawal. With repeated exposure to opioids, stimuli associated with the pleasant effects of the substances (e.g., places, persons, moods, and paraphernalia) and with the negative mental and physical effects of withdrawal can trigger intense craving or preoccupation with use12.
10. Davis, M., Walker, D. L., Miles, L., & Grillon, C. (2010). Phasic vs sustained fear in rats and humans: Role of the extended amygdala in fear vs anxiety. Neuropsychopharmacology. https://www.nature.com/articles/npp2009109
26. Goldstein, R. (2002). Drug addiction and its underlying neurobiological basis: Neuroimaging evidence for the involvement of the frontal cortex. American Journal of Psychiatry. https://pubmed.ncbi.nlm.nih.gov/12359667/
Fig1: Substance Abuse and Mental Health Services Administration (US); Office of the Surgeon General (US). Facing Addiction in America: The Surgeon General’s Report on Alcohol, Drugs, and Health [Internet]. Washington (DC): US Department of Health and Human Services; 2016 Nov. Figure 2.2, Areas of the Human Brain that Are Especially Important in Addiction. Available from: https://www.ncbi.nlm.nih.gov/books/NBK424849/figure/ch2.f2/
Fig3: Substance Abuse and Mental Health Services Administration (US); Office of the Surgeon General (US). Facing Addiction in America: The Surgeon General’s Report on Alcohol, Drugs, and Health [Internet]. Washington (DC): US Department of Health and Human Services; 2016 Nov. Figure 2.3, The Three Stages of the Addiction Cycle and the Brain Regions Associated with Them. Available from: https://www.ncbi.nlm.nih.gov/books/NBK424849/figure/ch2.f3/
Fig4: Substance Abuse and Mental Health Services Administration (US); Office of the Surgeon General (US). Facing Addiction in America: The Surgeon General’s Report on Alcohol, Drugs, and Health [Internet]. Washington (DC): US Department of Health and Human Services; 2016 Nov. Figure 2.5, Actions of Addictive Substances on the Brain. Available from: https://www.ncbi.nlm.nih.gov/books/NBK424849/figure/ch2.f5/
Fig5: Substance Abuse and Mental Health Services Administration (US); Office of the Surgeon General (US). Facing Addiction in America: The Surgeon General’s Report on Alcohol, Drugs, and Health [Internet]. Washington (DC): US Department of Health and Human Services; 2016 Nov. Figure 2.6, Major Neurotransmitter Systems Implicated in the Neuroadaptations Associated with the Binge/Intoxication Stage of addiction. Available from: https://www.ncbi.nlm.nih.gov/books/NBK424849/figure/ch2.f6/
Fig6: Substance Abuse and Mental Health Services Administration (US); Office of the Surgeon General (US). Facing Addiction in America: The Surgeon General’s Report on Alcohol, Drugs, and Health [Internet]. Washington (DC): US Department of Health and Human Services; 2016 Nov. Figure 2.8, Time-Related Decrease in Dopamine Released in the Brain of a Cocaine User. Available from: https://www.ncbi.nlm.nih.gov/books/NBK424849/figure/ch2.f8/
Fig7: Substance Abuse and Mental Health Services Administration (US); Office of the Surgeon General (US). Facing Addiction in America: The Surgeon General’s Report on Alcohol, Drugs, and Health [Internet]. Washington (DC): US Department of Health and Human Services; 2016 Nov. Figure 2.11, Major Neurotransmitter Systems Implicated in the Neuroadaptations Associated with the Preoccupation/Anticipation Stage of Addiction. Available from: https://www.ncbi.nlm.nih.gov/books/NBK424849/figure/ch2.f11/
Donating blood is a common and highly beneficial thing to do. Many selfless individuals donate their blood every year, benefiting sick patients all around the world. Each year, an estimated 6.8 million people in the United States alone donate blood, resulting in the collection of 13.6 million units of whole blood and red blood cells. However, many people wonder how exactly this donated blood is collected, processed, stored, and used, which is exactly what this article aims to investigate.
Types of Blood Donations
The standard types of blood donation must first be understood before answering such a question, because the type of blood being donated heavily influences the method by which it is collected. A standard whole blood donation consists of plasma, red and white blood cells, platelets, antibodies, and other components. Plasma donation, known as plasmapheresis, is another type of donation in which plasma is separated from the other components. This is done using a special machine, and red blood cells are returned to the donor in cycles throughout the donation. Plasma is often given to individuals in emergency and trauma situations to help stop bleeding. Lastly, platelet donation, or plateletpheresis, is conducted in a similar way to plasma donation, but red cells and plasma are returned to the donor. Platelets are typically given to individuals with clotting problems, cancer patients, or those who will have organ transplants or major surgeries. Additionally, a less common type is autologous donation, which occurs before an operation or transfusion, where a person donates blood for their own use. A directed or designated donation is also possible, where a donor gives blood intended for a specific person.
How Much Blood Can Be Given?
It is also crucial to understand exactly how much blood an individual can give before determining collection procedures. Healthy adults aged 18-75 are especially encouraged to donate, as are those with O negative blood, because it can be given to anybody, of any blood type, if necessary. A regular donation collects around 470 mL of whole blood, which is about 8% of an adult’s average blood volume. The body replaces this amount within 24 to 48 hours and replenishes red blood cells in 10 to 12 weeks. Individuals can donate whole blood every 12 weeks and plasma every 2 weeks.
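The figures above can be sanity-checked with a little arithmetic: if a 470 mL donation is roughly 8% of an adult’s blood volume, the implied total volume is just under 6 litres.

```python
donation_ml = 470          # typical whole blood donation
fraction_of_total = 0.08   # ~8% of average adult blood volume

# Working backwards: total volume = donation / fraction
total_blood_ml = donation_ml / fraction_of_total

print(f"implied total blood volume: {total_blood_ml / 1000:.1f} L")  # 5.9 L
```

This matches the commonly quoted figure of roughly 5 to 6 litres of blood in an average adult.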
How Is It Collected?
Now that the necessary background on blood donation is established, the specifics of how exactly blood is collected can be investigated. First, pre-screenings are conducted to ensure the donor’s blood is healthy and will not harm a recipient. All donated blood is screened for blood-borne diseases (HIV, hepatitis, syphilis), and the donor must not be suffering from a cold, flu, or any other illness at the time of donation. They must weigh at least 50 kg and have a normal temperature and blood pressure. During the COVID-19 pandemic, donors must also be screened for the virus.
Taking blood is a relatively simple process for regular blood donations. A blood pressure cuff or tourniquet is placed around the upper arm to fill the donor’s veins with more blood. A phlebotomist then cleanses an area of the arm with an antiseptic and inserts a sterile needle for the blood draw. This sterile needle is attached to a thin plastic tube and a blood bag. Once the needle is in, the donor should tighten their fist several times to help blood flow from the vein. Initially, blood is collected into tubes for testing and is then allowed to fill the blood bag. This process takes about 8 to 10 minutes and can be done sitting upright or lying down. When approximately a pint of blood has been collected, the donation is complete and staff will remove the needle, elevate the donation arm, and apply slight pressure to promote clotting before bandaging the arm.
The process for platelet donations is, however, slightly different. Though the same pre-screening and sterile needle method still apply, a needle and plastic tube are connected to both arms. For such donations, an apheresis machine is used, which collects a small amount of blood and separates the red cells, plasma, and platelets. It then returns the rest of the blood through the donor’s other arm, and this cycle is repeated several times over about 2 hours.
How Is Blood Transported Post-Draw?
Directly after blood is taken, the donation, test tubes, and donor records are labeled with an identical bar code. The donation is kept on ice before being taken to a processing center, while the test tubes go to the lab. At the processing center, information about the donation is scanned into a computer database before further steps are taken to prepare the blood to be shipped out to hospitals. Most whole blood donations are spun in centrifuges to separate them into transfusable components: red cells, platelets, and plasma. Plasma may be further processed into cryoprecipitate, a component that helps control the risk of bleeding by helping blood to clot. The red cells and platelets are leuko-reduced, meaning white blood cells are removed to reduce the recipient’s chance of reacting to the transfusion. Each component is then packaged as a “unit”, a standardized amount that doctors use when transfusing a patient.
In parallel with all of this, the test tubes arrive at a testing laboratory where a dozen tests are performed to establish the blood type and test for infectious diseases. The test results from this are electronically transferred to a processing center within 24 hours and if results are positive, the donation is discarded.
When test results are received, units suitable for transfusion are labeled and stored. Specifically, red cells are stored in refrigerators at 6 degrees Celsius for up to 42 days, platelets are stored at room temperature for up to 5 days, and plasma and cryoprecipitate are frozen and stored for up to 1 year. When blood is needed, it is available for distribution to hospitals 24/7. Hospitals typically keep blood units on their shelves, but they may call for additional units in times of urgency.
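The storage rules above amount to a simple shelf-life lookup per component. The sketch below is illustrative only, with component names and dates chosen here for convenience:

```python
from datetime import date, timedelta

# Shelf lives from the storage rules described above
# (cryo = cryoprecipitate; red cells refrigerated, plasma/cryo frozen).
SHELF_LIFE = {
    "red cells": timedelta(days=42),
    "platelets": timedelta(days=5),
    "plasma":    timedelta(days=365),
    "cryo":      timedelta(days=365),
}

def expiry_date(component: str, collected: date) -> date:
    """Last day a unit collected on `collected` may still be used."""
    return collected + SHELF_LIFE[component]

collected = date(2021, 3, 1)  # hypothetical collection date
print(expiry_date("platelets", collected))  # 2021-03-06
print(expiry_date("red cells", collected))  # 2021-04-12
```

The short platelet window is why platelet supplies run out fastest and why blood services ask platelet donors to give more frequently.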
How Does Blood Transfusion Occur? What Products Are Made From Blood?
After the blood donation has reached the hospital, it is removed from storage and given to a patient in need through an intravenous line. This is a tiny tube that is inserted into a vein with a small needle and the transfusion process takes about 1 to 4 hours in total depending on the amount of blood needed.
Additionally, specific products can be made from blood. From whole blood, red cells can boost the oxygen-carrying ability of a patient’s blood. Platelets help blood clot, assisting those recovering from a severe hemorrhage. Lastly, plasma can be used to treat people with burns or cancer and to protect people with brain and nerve diseases. Plasma contains antibodies and other important proteins used to make human immunoglobulin, human albumin, human coagulation factor IX, and many other specific products.
Blood donations travel a long way, and though it only takes a few minutes to donate whole blood, a pint of blood could save a life. Due to the lack of donations during the pandemic, blood supplies are at an all-time low; thus, individuals should donate blood if they can and are healthy enough to do so.
The human immune system does a remarkable job protecting us from foreign invaders, such as viruses and bacteria. Over the last decade, the field of cancer immunology has made tremendous progress in elucidating the relationship between the immune system and cancer cells. Research has shown that cancer cells can effectively “hide” themselves from the immune system, allowing them to grow unperturbed. Proteins expressed on the cell membrane of cancer cells serve as “stop” signals for our immune system, preventing immune cells from killing cancer cells. This discovery has enabled scientists to create a new class of drugs called immunotherapeutics, which block these “stop” signals from the cancer cell, thereby allowing our immune system to selectively find and kill cancer cells. While chemotherapy kills both cancer cells and healthy cells, causing widespread damage to cancer patients, immunotherapeutics not only act as an effective treatment but also cause less damage to the patient. Immunotherapy is the future of cancer treatment and holds significant promise in taking us towards a cure for cancer. This review paper will describe how the human body’s immune system works, how cancer evades our immune system, and how immunotherapy restores the immune system’s function.
1.1 – What Is Cancer?
Cancer is arguably the most frightening disease to be diagnosed with, and despite its long-standing presence in human history, society continues to race towards the perfect treatment. Data from 2017 showed that almost 10 million people died from cancer, and it was estimated that approximately 2 million people would be diagnosed with cancer in 2021. Cancer is the second-largest global killer of humans, right below cardiovascular disease. Due to its prevalence and deadly impact, cancer is an exceedingly pressing issue.
While viruses and bacteria are foreign organisms to the host, cancer cells are unique in the sense that they originate from healthy cells in the body. There are countless ways in which cancer cells can form, but there is an overarching theme to their genesis, which begins with the cell cycle. The cell cycle is the process by which cells reproduce through cell division. There are four stages in the cell cycle: G1, DNA synthesis, G2, and mitosis. The most important stage in the development of cancer is DNA synthesis. Healthy cells create two copies of identical, healthy DNA after DNA synthesis, but sometimes an error is made during DNA synthesis which can mutate, or “break”, the DNA being synthesized. Once DNA mutates beyond a certain point, it can lose the ability to perform essential functions such as telling cells when to stop dividing. When this happens, cancer cells form. Cancer cells are cells that have lost the ability to stop dividing. As a result, the cells become malignant and begin to spread throughout the body. In fact, the term “cancer” acts as a placeholder, indicating various types of uncontrolled cell growth. This occurs because the gene, a specific part of our DNA, which would normally tell cells to stop dividing, has mutated and lost its ability to do so. Cancer diagnoses have different stages of severity. If the cancer is localized to one part of the body, the odds of surviving are much better than if the cancer has spread to other parts of the body. Ultimately, to effectively treat cancer, it must be stopped at every stage.
1.3 – Types Of Cancer
Cancer is not a single disease, because cancer cells can arise in different parts of the body. Moreover, different cancers have different survival rates. For instance, melanoma is the abnormal growth of skin cells, and glioblastoma is the growth of abnormal brain cells. Glioblastoma is one of the deadliest types of cancer, with most patients dying within 2 years of diagnosis. Melanoma, if caught early enough, can have survival rates as high as 100%, simply because the cancerous cells can be cut out before they begin to cause serious problems. In addition, patients with common cancers (e.g., breast, skin, prostate) are known to have a higher chance of survival, mainly due to doctors’ increased familiarity with these types of cancer, as well as advances in the equipment and technology used to fight these diseases. However, some cancers are far more lethal. In fact, Karen Selby from The Mesothelioma Center states, “According to the American Society, lung cancer – and lung cancer caused by asbestos – is the number one killer, with 142,670 estimated deaths in 2019 alone…” Lung, tracheal, and bronchial cancer have claimed the highest number of lives, with a devastating “1.9 million in 2017” (Our World In Data).
1.4 – Risk Factors
Getting cancer can be due to poor genes, an unhealthy lifestyle, or bad luck. If mutated genes are passed down from parent to child, the likelihood of developing cancer increases. If one smokes, drinks, and has poor eating habits, mutagens can be introduced into the body which mutate the DNA and cause cancer. Smoking, for example, introduces a significant number of mutagens into the body, which mutate the DNA and cause extreme damage to cells, making them cancerous. Lastly, a random error made during DNA synthesis can be enough to cause cancer. Although this last cause is very rare, it can happen.
2 – Current Research
2.1 – Treatments
Given the deadly nature of cancer, there has been a lot of research into diagnosing and treating it. To determine the presence of cancer, various methods such as physical exams, laboratory tests, imaging tests, and biopsies are used. When a patient has been diagnosed with cancer, a doctor can proceed in various ways, based on the threat level and current stage. Depending on the type of cancer, doctors select the appropriate treatment to handle the disease. This may include curative, primary, adjuvant, and palliative treatment. A cure occurs when the doctor removes any possibility of cancer cells returning, allowing the patient to live a normal life. Such a treatment is highly unlikely with cancer and depends heavily on the patient’s circumstances. A much more practical option is primary treatment, wherein the doctor removes all cancer cells from the patient’s body, typically by surgery. However, even when cancer cells are removed, there is a high chance of cancer continuing to grow, underlining the importance of adjuvant treatment. Adjuvant treatment is an attempt to exterminate any cancer cells that reappear. There are also cases where it is not possible to stop the cancer cells’ growth. In these cases, the most effective approach is palliative treatment, which focuses on alleviating a patient’s symptoms when remission is not possible.
2.2 – Chemotherapy
One of the most common treatments for cancer is chemotherapy. Chemotherapy consists of the use of various drugs to prevent cancer cells from growing and dividing. The treatment is most effective after the formation of large tumors that have begun to spread rapidly and cannot be effectively removed through surgery. While radiation and surgery target specific areas of the body, chemotherapy can treat the entire body. Unfortunately, chemotherapy has major side effects. As chemotherapy drugs work their way through the body, they can affect other fast-growing healthy cells, including those of the skin, hair, intestines, and bone marrow. Due to the potential for major complications, chemotherapy is not a perfect treatment, which is why scientists have been working furiously to create new ways of treating cancer.
Figure 2: Effects of Chemotherapy on the human body
The human immune system helps fight off invaders in the body, such as viruses and bacteria. However, cancer is often able to grow uncontrolled and unperturbed by our immune system. A common cold can be cleared in a matter of days with little to no medicine, as our immune system eradicates the virus from our bodies. With cancer, by contrast, treatments like chemotherapy and radiation are almost always necessary to eradicate cancer cells. To understand why the body is often unable to fight cancer cells, it is important to understand how the human immune system works.
The immune system is an umbrella term for a set of specialized cells that defend the human body against viruses, bacteria, and fungi. Humans have two types of responses to any foreign invader: innate and adaptive. The innate response is the first response from the immune system. It is what causes the redness and inflammation associated with cuts on the skin, and it is responsible for the first round of destruction of foreign invaders. The adaptive response is a more specialized and coordinated defense. It is responsible for recognizing previously fought infections, such as viruses, and producing neutralizing molecules, called antibodies, that are mobilized against them.
3.2 – Organs of The Immune System
Each of the body's systems has specialized organs and cells, and the immune system is no different. Much like how the cardiovascular system is composed of the heart, veins, and arteries, the immune system comes with its own specialized set of organs. Bone marrow, the thymus, the tonsils, the spleen, and the lymph nodes are hubs where immune cells are born, matured, and kept in reserve until they are needed to fight off foreign invaders. Moreover, the human body comes equipped with an external barrier, the skin, which makes it difficult for viruses, bacteria, and fungi to enter the body. The best way of fighting germs is to make sure they do not enter the body in the first place. The skin is the first line of defense against germs, but if the skin gets cut, there are immune cells lying in wait underneath the surface, ready to attack whatever comes in. Certain reflexes, like coughing and sneezing, also help the body expel germs that may be trying to get into the airways.
The immune system also comes with its own specialized set of cells, including lymphocytes, neutrophils, and macrophages. Lymphocytes are cells that take part in the adaptive immune response and help the body fight off viruses. These cells originate in the bone marrow. Some move to the thymus, where they mature further and become T-cells, whereas others stay in the bone marrow and become B-cells. T-cells can be thought of as the United States Marines of the immune system: they must be called to a site of danger, they kill selectively while leaving non-intruders unharmed, and they inform other members of the group what the invaders they attacked look like. T-cells effectively search for and destroy whatever is not supposed to be in the body. B-cells act more like the FBI of the immune system. They keep a record of invaders the body has fought off, and they deploy specialized molecules called antibodies to help neutralize invaders and alert T-cells to a site of danger. Neutrophils and macrophages are part of the innate immune system. They are both responsible for engulfing foreign invaders like bacteria and other small microbes and preventing them from causing mass havoc in the body. If the innate response is not sufficient to get rid of invaders, the adaptive response then arrives to act on them.
3.4 – Receptors and Ligands
Communication is extremely important for immune cells. Immune cells need to be able to communicate with one another about potential invaders and the actions they take against them. Cells communicate with each other through molecules known as receptors and ligands. Receptors are proteins that generally reside on the outer membrane of cells, exposed to the environment, and ligands are the molecules that attach to their host receptor, much like a lock-and-key mechanism: each key (a ligand) fits a particular lock (a receptor). Immune cells “know” what to do depending on which ligand and receptor come into contact. If the ligand is unfamiliar to the immune cell, the immune cell recognizes it as foreign, and the cell expressing this foreign material must be destroyed because it may be harmful to the body. Receptor-ligand interactions can be stimulatory, meaning they stimulate immune cells to go out and attack invaders, or immunosuppressive, meaning they “turn off” an immune response. To properly regulate our immune response, healthy cells must express immunosuppressive ligands so that the body's immune system does not attack itself, while foreign invaders express stimulatory ligands that prompt the immune system to attack.
Receptors are crucial for immune cells to distinguish unhealthy cells from healthy ones. Ligands known as non-self antigens (from bacteria or fungi, for example) trigger the immune response, resulting in the invader being attacked. It is extremely important for the immune system to be able to make this distinction, as unhealthy cells are detrimental to an individual's health, while healthy cells allow the body to function correctly.
Figure 4: Overview of The Immune System-Reaction of the Killer T-Cell after recognizing a healthy and infected cell (receptor + antigen)
3.5 – Cancer Hiding from The Immune System
Although the immune system can protect us from foreign invaders, it seemingly has trouble protecting us from cancer cells. Healthy cells in our bodies know the tricks the body uses to protect its own cells from the immune system. Cancer cells, having developed from once-healthy cells, retain this knowledge of how to hide from the immune system. This evasion allows cancer cells to grow virtually undisturbed in the human body. Cells in our bodies use receptors and ligands to communicate with the immune system, and one “hiding” mechanism cancer cells exploit is the PD-1/PD-L1 axis. PD-1 (programmed death 1) is a receptor found on T-cells. PD-L1 (programmed death-ligand 1) is a ligand found on many healthy cells in human bodies (such as placental cells) as well as on cancer cells. When PD-L1 and PD-1 come into contact, the T-cells that express PD-1 are essentially “turned off.”
When PD-L1 expressed by cancer cells encounters PD-1 expressed by T-cells, the cancer cells effectively “turn off” the T-cells and prevent them from attacking. This process is incredibly effective at preventing an immune response and has been a source of drug-discovery efforts: if this interaction prevents T-cells from attacking cancer cells, then it can be targeted with drugs, allowing the immune system to attack the cancer.
Figure 6: Cancer Cells Preventing the Immune Response – Reaction of cancer cells passing the immune system checkpoint using the PD-L1 pathway
4.1 – What Is Immunotherapy?
Recent research and clinical success have shown that our own bodies may be the best tool for fighting cancer. Immunotherapy has made remarkable progress in the last decade, showing a great deal of promise for the future. According to the CRI (Cancer Research Institute), “Cancer immunotherapy, also known as immuno-oncology, is a form of cancer treatment that uses the power of the body’s own immune system to prevent, control, and eliminate cancer.” In simpler terms, immunotherapy allows the immune system to attack cancer cells, using additional help from outside sources. An example mentioned before is the PD-1 pathway, in which the cancer cell expresses PD-L1, allowing it to evade the PD-1 receptor of the killer T-cell. With immunotherapy, it is possible to block the PD-1 receptor, leaving the cancer cell defenseless against the killer T-cell. The treatment can come in various forms, such as antibodies, cancer vaccines, adoptive cell transfer, tumor-infecting viruses, checkpoint inhibitors, cytokines, and adjuvants. Immunotherapy is also known as a form of biotherapy (BRM therapy). Genetic engineering, in the form of gene therapies, may also be used to support the fighting capabilities of immune cells. Immunotherapy is, in short, the science of using our immune system to treat diseases.
Figure 7: Drugs Blocking Receptor and T-Cell Killing Cancer Cells – As a result of the drug blocking the PD-1 pathway, the killer T-cell can release its molecules, attacking the cancer cell
Figure 8: Anti-PD-1 antibody immunotherapy (purple and green) binding to PD-1 (gold) – Immunotherapeutic agents act as pseudo-ligands and prevent real ligands from binding to their target receptors. Anti-PD-1 physically blocks PD-L1 from binding to PD-1, thereby instigating an immune response against cancer cells
Though advancements have been made in chemotherapy, it is important to know the differences between immunotherapy and chemotherapy. Chemotherapy acts largely indiscriminately, killing both cancer cells and healthy cells. Moreover, there is a possibility of the cancer returning later, which may require multiple rounds of treatment. Immunotherapy, on the other hand, allows the immune system to selectively kill cancer cells while leaving non-cancerous cells intact, for the most part. This means that our immune system is monitoring our bodies 24/7 to search for and destroy newly arising cancer cells, whereas chemotherapy targets cells that rapidly divide, both cancerous and non-cancerous, leading to loss of skin, hair, and bone marrow cells. Immunotherapy generally comes with much milder side effects; common ones include skin irritation around the injection site, fever and chills, rashes, and diarrhea. As with all medicines, it is ultimately up to the physician to decide what is in the best interest of the patient's health. Given the type of cancer, its severity, and the patient's biology, one form of treatment may be better than the other.
4.3 – Future of Immunotherapy
Immunotherapy has already been approved in the United States, as well as around the world, to treat a variety of cancers. As scientists and doctors continue to recognize the promise of immunotherapy, strong advances continue to be made in the field. The CRI (Cancer Research Institute) states, “As of June 2017, the U.S. Food and Drug Administration (FDA) has approved 32 different immunotherapies for patients with cancers…” Thirty-two approved drugs show how much the field of oncology has invested in this treatment. Immunotherapy is typically used with other treatments (e.g., chemotherapy or radiation therapy). Because research on the treatment is relatively recent, doctors and patients do not yet consider it the primary option; in fact, doctors typically suggest it only after other curative treatments have been exhausted. Bob Tedeschi of STAT (an American health-oriented news website) explains that “Immunotherapies work for only around 15 to 20 percent of cancer patients who receive them”, which is partly why they are not prescribed first. Immunotherapy has a lot of potential, yet at its current stage it can benefit only a small percentage of patients. However, scientists are still determined to improve the treatment. A current focus of immunotherapy research is directing immune cells to the exact location of cancer cells so that the therapy can function more effectively. The treatment shows a great deal of potential, yet more studies must be done for it to benefit the thousands of lives impacted by cancer. It is important for patients to determine with their oncologist which treatment is the best option for them. Given the incredible advances already shown, however, it is natural to expect immunotherapy to become more prominent in the future.
Cancer is one of the many hallmarks of the human experience; chances are that each of us knows, or has heard of, someone who has died from this disease. Given medical advancements and a higher level of public awareness, we are now able to diagnose cancer early, thereby increasing the survival rate for many people. However, some cancers do not display symptoms until they have become malignant and spread to the rest of the body. Chemotherapy, once regarded as the only treatment for cancer, has begun to lose its spotlight given the success of tumor immunology and immunotherapy. Immunotherapeutic drugs allow the body to kill cancer cells without harming healthy cells, while also keeping a lookout for any new cancer cells that may arise. Even in its infancy, immunotherapy has effectively treated a significant number of cancer patients. Although much is yet to be known about immunotherapy, and though it is not foolproof, it holds significant promise as a safe and effective treatment for all cancer types, bringing us closer to a cure.
I would like to thank Polygence for giving me this incredible opportunity, as well as my mentor Nooriel Banayan, who always encouraged me and was with me through every step. Without the knowledge gained through countless lessons, this project wouldn’t have been possible. This paper is for anyone who is not up to date with the progress made in cancer treatments, and it focuses on immunotherapy as the future. The cover photo was from png find, https://www.pngfind.com/mpng/hiRoixi_human-body-transparent-background-transparent background-transparent-human/.
Sailesh Gunaseelan, Youth Medical Journal 2021
Autism Spectrum Disorder (ASD) is a neurodevelopmental disorder, characterized by deficits in social communication and restricted, repetitive behaviors, that affects approximately 1 in 54 children. Differences in producing and processing language are common in those with ASD and impact social development. Use of communication tools, including gestures and cues such as gaze, is affected in children with ASD. In addition to behavioral differences, the main neuroanatomical differences affecting language in those with ASD are differences in cortical thickness and in the amount of gray matter. Autistic children often reach developmental milestones later than typically developing children, and they often show flatter growth in early childhood with steeper growth in adolescence. They also show reduced gestures with a later onset, as well as differences in cortical thickness.
Autism Spectrum Disorder is a neurodevelopmental disorder characterized by deficits in social communication and restricted, repetitive behaviors. According to the Centers for Disease Control and Prevention, about 1 in 54 children can be reliably diagnosed with ASD by the age of two (2020a, 2020b). According to the National Institute of Mental Health, parents and educators request a diagnosis when abnormal development is seen in children; this is followed by an evaluation by a group of doctors or other health professionals, such as speech-language pathologists, with experience in diagnosing ASD (2018).
The symptoms of ASD, including those affecting language, are highly heterogeneous. Because of this variety, delayed development of language is common but not present in all autistic individuals, and evidence supports the idea that language develops separately from autistic traits (Gernsbacher et al., 2016). In autistic individuals, language is often reduced and its onset is often delayed; the delayed onset of language is the most pressing concern for autistic children's parents (Mayo et al., 2013). Echolalia (repeating expressions word for word) and pronoun reversal (using “I” when “you” is meant) are common in autistic individuals (Gernsbacher et al., 2016).
This review will address the ages at which autistic individuals reach developmental milestones, the different developmental trajectories that the language of autistic children follows, social communication differences, and neuroanatomical differences affecting language.
Developmental Milestones and Trajectories
Throughout childhood development, there are many language milestones to reach: speech-like production, first words, first phrases, and understanding. Children with ASD often reach these milestones at a later age than typically developing (TD) children. Some children with ASD do not develop expressive language (language production), but this review will only address those who do.
Young children with ASD often demonstrate reduced sensitivity to human voices. Specifically, they appear unable to orient or respond to vocal cues, even though they show responsiveness to other non-vocal stimuli (Sperdin and Schaer, 2016). They also show unresponsiveness to their name (Figure 1).
Figure 1 (Sperdin and Schaer, 2016) | Early language delays and impairments (unresponsiveness to name, delayed canonical babbling, more non-speech productions, fewer speech-like vocalizations, delayed occurrence of first words) that indicate a higher risk of ASD are shown by the red dashed rectangles.
Typically developing infants reach the language milestone of canonical babbling at around 10 months of age. Canonical babbling is strings of syllables that include a vowel and a consonant. Young children with ASD (aged 16-48 months) show low canonical babbling production even when matched for expressive language subgroups. A study reviewed home videos of children from birth to two years, and concluded that typically developing children were more likely than children with ASD to have reached the canonical babbling stage at both the age ranges 9-12 months and 15-18 months. In addition, the infants who were later diagnosed with ASD demonstrated significantly lower total vocalizations than typically developing infants (Patten et al., 2014).
Studies of human language development have taught us that typically developing children form their first words at an age of 12-18 months. By comparison, children with ASD typically develop their first words at an average age of 36 months. A study referring to retrospective parent reports on the age of first words showed that when comparing children who had and had not produced first words by a benchmark age, the verbal children consistently scored higher on cognitive assessments (Mullen Scales of Early Learning Visual Reception, Expressive and Receptive Language), and communicative skills (Vineland Adaptive Behavior Scales Communication domain) and lower on autism severity (Childhood Autism Rating Scale total score) even though the group performance of all the autistic children was still below average (Mayo et al., 2013).
Another study involving retrospective parent reports on the age of onset of the first phrases showed that the group with phrase speech by 24 months achieved higher levels of verbal ability, nonverbal ability, and IQ at school age. The group was also less likely to have sentence repetition deficits and impairments in adaptive communication (Kenworthy et al., 2012). Another study concluded that those with earlier first words had higher levels of language production in the future and, after controlling for the age of first words, those who reached the milestone later gained adaptive skills faster. Although the age of first words predicted expressive language and adaptive skills, it did not predict receptive language (language understanding) or nonverbal cognition (Kover et al., 2016).
Other studies have demonstrated that children with ASD show more impairment in receptive language, and that expressive language may develop ahead of receptive language in children with autism. Three-year-olds with ASD showed reduced initiations, reduced use of interrogative questions (who, what, when, where, and why), and reduced responses to parents, even though their parents initiated interactions with their children just as often as other parents. In comparison, toddlers with language delays also spoke less, but they had the same rates of responses to parents, use of interrogative questions, and initiated interactions as typically developing children. This supports the idea that toddlers with ASD struggle more with the social aspects of language than with rules such as grammatical markings (Bacon et al., 2019).
A behavioral study designed to assay social feedback in the context of language concluded that, compared to typically developing children, children with autism produce fewer speech-related vocalizations, and their responses depend less on whether the cue vocalizations are speech-related. In both typically developing and autistic children, a subsequent vocalization was more likely to contain speech-related material if the previous speech-related vocalization had received an adult response (Warlaumont et al., 2014). Highlighting the significance of adults' responses, another study addressed the relationship between adults' expressions and attention and the child's vocalization rate. Previous behavioral work with young TD children suggests that a parent's still face, signaling a withdrawn state, tends to increase the vocalization rate, as infants wish to engage their withdrawn parent; whether the same is true of children with autism has yet to be answered (Patten et al., 2014). The reduced contingency of the responses of autistic children's parents provides fewer opportunities for the children to learn and later improve their language. In the future, improving the contingency of responses on the speech-relatedness of vocalizations could help improve children's language (Warlaumont et al., 2014).
A study that included six home visits with a 30-minute semi-structured parent-child play session showed that autistic children with higher verbal skills (ASD-HV) had a growth trajectory similar to that of typically developing children on several language measures, while those with lower verbal skills (ASD-LV) had flatter trajectories. ASD-HV children and typically developing children showed improvement on most language measures over time, whereas ASD-LV children demonstrated less progress, increasing only in total utterances and in five of the 14 grammatical morphemes (Tek et al., 2014).
Overall, the developmental trajectories of children with autism are flatter than those of typically developing children; however, there is steep growth observed at later time points. When compared with typically developing children with a similar size vocabulary, the vocabularies of children with ASD contain similar relative proportions of words from different grammatical and semantic categories. There is also an overlap in the words spoken most often. Still, the heterogeneity of language in children with autism must be emphasized (Gernsbacher et al., 2016).
A common finding among several studies is that gesture comprehension and gesture production play a role in the later development of spoken language. Early onset of gesture production is significantly correlated with higher language abilities later in life. Brooks and Meltzoff (2008) explain that gestures provide infants with a communicative tool that prompts caregivers to say the name of the object the toddler is referring to. It has been shown that infants vocalize more frequently when pointing with the index finger rather than the whole hand, which makes sense because the declarative motive is to share a thought about the object being referred to. While typically developing infants begin to point at around 9-12 months, children with ASD show a delay in the onset of gesture production (Ramos-Cabo et al., 2019).
Children with ASD had significantly lower gesture rates than typically developing children, with pointing gestures affected the most. In addition, the children with ASD showed a lower variety of gestures (Ramos-Cabo et al., 2019). In children with ASD between 18 and 24 months of age, an outcome of primarily nonverbal communication at 3 years of age was best predicted by a deficit in understanding, the rate of communicative acts for behavior regulation, the inventory of conventional gestures, the consonant inventory, the inventory of conventional play actions, and stacking blocks. Verbal communication skills at 3 years of age were best predicted by acts for regulating another person's behavior and the inventory of consonants (Wetherby et al., 2007).
Evidence shows that children with ASD do not differ significantly from TD children in the first year of life in social-communication skills, possibly reflecting a developmental trajectory where ASD emerges as a behavioral decline in social engagement in the second year (Ramos-Cabo et al., 2019). It could be noted here that, according to the Centers for Disease Control and Prevention, by age two, a diagnosis of ASD can be considered reliable, although many children do not get diagnosed until approximately four years of age (2019, 2020a).
Through analyzing home videotapes of first birthday parties, Osterling and Dawson (1994) discovered that at 1 year of age, a lack of pointing, showing, looking at the face of another, and orienting to name differentiated autistic children from TD children. Additionally, in a follow-up study, decreased use of gestures and increased repetitive behaviors were found to separate autistic children from TD children, but not from developmentally delayed (DD) children (Wetherby et al., 2007). In another study, Wetherby et al. (2004) found that children with ASD differed significantly from TD and DD children in demonstrating a lack of appropriate gaze; a lack of warm, joyful expressions with gaze; a lack of sharing enjoyment or interest; a lack of response to name; a lack of coordination of gaze, facial expression, gesture, and sound; a lack of showing; unusual prosody; repetitive body movements; and repetitive movements of objects. The children with ASD differed significantly from the TD group, but not the DD group, in a lack of pointing, a lack of vocalizations with consonants, and a lack of conventional play with toys.
In a study assessing eye gaze, autistic children were presented with images of faces below two candy bars, one face looking forward and the other looking at a candy bar. The children were able to see the gaze shift but were unable to answer which candy bar the face wanted, suggesting that they could not take meaning from the direction of the gaze (Redcay, 2008); gaze interpretation is not automatic in autistic children. In terms of eye gaze, autistic children show fewer gaze shifts and fewer gestures integrated with vocalizations and eye gaze than DD children (Mayo et al., 2013). Another study showed that, compared with TD and DD children, children with ASD showed less social gaze in response to an adult's distress, fewer gaze shifts in response to the activation of mechanical toys, and less imitation of modelled actions. During free play, the autistic children showed fewer gaze shifts between people and objects compared to the DD and TD children and spent less time looking at people (Wetherby et al., 2007). This highlights the significantly reduced joint attention in those with ASD compared to TD and DD children (Mayo et al., 2013). In summary, autistic children display fewer acts of communication: expressions, gaze shifts, and gestures.
The first studies to investigate auditory language processing used positron emission tomography (PET) and functional magnetic resonance imaging (fMRI) and found reduced temporal activation and reduced left lateralization of activation in those with ASD compared to controls. A study that examined fMRI activation during a language task in adolescents with ASD found correlated activation between left frontal and temporal regions in controls but not in individuals with ASD, suggesting less efficient connectivity in those with ASD because of differences in communication between important language areas (Knaus et al., 2008). Electroencephalography (EEG) and fMRI studies of individuals with autism have observed disrupted connectivity, which results in the brain functioning as a less cohesive unit (Saffin and Tohid, 2016).
In a large-scale MRI study, the ASD group showed increased cortical thickness compared with the control group, especially in the left hemisphere. All cortical regions adjacent to the arcuate fasciculus, including Broca's area (speech production) and Wernicke's area (speech comprehension), showed increased cortical thickness. It is important to note here that it is the left hemisphere of the brain that mainly controls language. The relation between the severity of social and communication symptoms and cortical thickness was strongest in regions next to the left arcuate fasciculus (Khundrakpam et al., 2017).
Several imaging studies showed that the difference in total gray matter volume in people with ASD compared to TD individuals was larger in early childhood (about 12% at 2 years) compared to early adulthood (about 2% at 19 years), with those with ASD having a higher volume. During adolescence, cortical thickness gradually declines in TD individuals, while it declines more rapidly in individuals with ASD, with little difference in cortical thickness by about 35 years of age. This increased size is suggested to have been due to a delay in regressive neurodevelopmental processes (synaptic pruning), abnormal macroautophagy and a high number of synaptic spines, which trigger unusually high neural losses (Khundrakpam et al., 2017). In another MRI study, both children with autism and their discordant co-twins showed reduced volumes of frontal, temporal, and occipital white matter (Mitchell et al., 2009).
Greater-than-normal cortical thickness was also found in the fusiform face area, posterior superior temporal sulcus, and frontal eye fields, areas involved in different aspects of face processing. In a post hoc analysis of a different study, a significant positive correlation was found between residual cortical thickness and the severity of social affect and communication symptoms in the left precentral and postcentral gyri, left supramarginal and fusiform gyri, right inferior frontal and middle frontal gyri, and bilateral inferior frontal gyri (Khundrakpam et al., 2017). In addition, negative associations were found between amygdala volume and social reciprocity, with younger, less affected autistic subjects proposed to have normal to enlarged amygdala volumes, and older, more affected individuals having smaller amygdala volumes (Mitchell et al., 2009).
Studies support the fact that the mirror neurons communicate through various networks including the amygdala-hippocampal circuit, caudate nuclei, cerebellum, and frontal-temporal regions – regions found to be damaged in ASD. EEG and fMRI studies of individuals with autism have seen a lack of activity in the mirror neuron system. It must be noted that the mirror neuron system has previously been associated with empathy, social reciprocation, verbal and non-verbal communication, and language (Saffin and Tohid, 2016).
The superior temporal sulcus (STS) responds most strongly to stimuli that are significant to communication, especially stimuli that are human-voice-specific and complex. Decreased concentrations of gray matter were found in areas near the STS. Analysis of cortical sulcal maps in autistic children showed right anterior shifting of the STS, interpreted by the authors as delayed or incomplete sulcal development. Social stimuli such as facial expressions and voices evoke less STS activity in autistic individuals than in controls, possibly reflecting a failure to take social communicative meaning from gaze directions, expressions, or vocal sounds. In TD individuals, greater activation is seen in the STS during inference of the goal or intention of another person, compared to simply perceiving biological motion, which contributes to the implication of the STS in theory of mind perception (Redcay, 2008). Theory of mind, the cognitive ability to infer the mental states of others, is often impaired in those with ASD (Senju, 2012).
In summary, there are changes in activation (left lateralization) and structure (gray matter volume, white matter volume, cortical thickness) in the brains of those with ASD. These changes are especially pronounced in language areas in the frontal and temporal lobes, and the degree of change correlates with the severity of symptoms.
Individuals with autism often reach developmental milestones later than typically developing children. Evidence supports the idea that the later a child reaches a developmental milestone, the worse their future cognitive ability and adaptive skills will be. This opposes the “wait and see” approach and strengthens the “watch and see” approach, in which development is monitored over short periods of time and intervention is given as soon as delayed development is observed (Mayo et al., 2013).
Autistic children also have lower gesture rates, just as they often have lower rates of speech-related vocalization (Warlaumont et al., 2014). They also differ from TD children in their lack of appropriate gaze; lack of coordination of gaze, expression, gesture, and sound; lack of shared interest; and repetitive movements (Wetherby et al., 2007). Gesture production and comprehension precede the onset of speech: infants usually first name objects that they had previously referred to with gestures, and later transition to the two-word stage through gesture-speech combinations (Ramos-Cabo et al., 2019). This suggests that addressing low gesture rates or late gesture production in young children may improve their later language development.
In the brain, there are differences in connectivity, gray matter volume, and cortical thickness, especially in language areas in the frontal and temporal lobes. In the future, these neuroanatomical differences may be used to diagnose ASD, with the degree of difference predicting the severity of symptoms. Such differences might help physicians diagnose ASD reliably at an earlier age, allowing earlier treatment.
The differences in synaptic pruning, and the large differences in brain size that precede them, raise a question about the role of synaptic pruning in the improvement of ASD symptoms. Gernsbacher (2016) wrote that, in the first few years of life, the developmental trajectories of children with autism are flatter, with steep growth observed later. Whether inducing synaptic pruning early in childhood, once the disorder is identified, could lessen symptoms is yet to be answered.
The heterogeneity of ASD must be emphasized again, as every child with ASD differs in their symptoms and in their responses to intervention. Some of the findings mentioned here may apply to some children but not to others. This field has only begun to understand why socialization is reduced in autistic individuals; decreased communicative acts could be due to impaired motor skills, reduced motivation to communicate, atypical brain structure and activation, or any combination of such factors (Warlaumont et al., 2014). Further research is needed to truly shed light on what underlies these phenotypes.
Preethi Nalluru, Youth Medical Journal 2021
Bacon, E. C., Osuna, S., Courchesne, E., & Pierce, K. (2019). Naturalistic language sampling to characterize the language abilities of 3-year-olds with autism spectrum disorder. Autism, 23(3), 699–712. https://doi.org/10.1177/1362361318766241
Brooks, R., & Meltzoff, A. N. (2008). Infant gaze following and pointing predict accelerated vocabulary growth through two years of age: A longitudinal, growth curve modeling study. Journal of Child Language, 35, 207–220. https://doi.org/10.1017/S030500090700829X
Kenworthy, L., Wallace, G. L., Powell, K., Anselmo, C., Martin, A., & Black, D. O. (2012). Early language milestones predict later language, but not autism symptoms in higher functioning children with autism spectrum disorders. Research in Autism Spectrum Disorders, 6(3), 1194–1202. https://doi.org/10.1016/j.rasd.2012.03.009
Khundrakpam, B. S., Lewis, J. D., Kostopoulos, P., Carbonell, F., & Evans, A. C. (2017). Cortical thickness abnormalities in autism spectrum disorders through late childhood, adolescence, and adulthood: A large-scale MRI study. Cerebral Cortex, 27(3), 1721–1731. https://doi.org/10.1093/cercor/bhx038
Knaus, T. A., Silver, A. M., Lindgren, K. A., Hadjikhani, N., & Tager-Flusberg, H. (2008). fMRI activation during a language task in adolescents with ASD. Journal of the International Neuropsychological Society, 14(6), 967–979. https://doi.org/10.1017/S1355617708081216
Kover, S. T., Edmunds, S. R., & Weismer, S. E. (2016). Brief Report: Ages of Language Milestones as Predictors of Developmental Trajectories in Young Children with Autism Spectrum Disorder. Journal of Autism and Developmental Disorders, 46(7), 2501–2507. https://doi.org/10.1007/s10803-016-2756-y
Mayo, J., Chlebowski, C., Fein, D. A., & Eigsti, I.-M. (2013). Age of first words predicts cognitive ability and adaptive skills in children with ASD. Journal of Autism and Developmental Disorders, 43(2), 253–264. https://doi.org/10.1007/s10803-012-1558-0
Mitchell, S. R., Reiss, A. L., Tatusko, D. H., Ikuta, I., Kazmerski, D. B., Botti, J. A. C., Burnette, C. P., & Kates, W. R. (2009). Neuroanatomic alterations and social and communication deficits in monozygotic twins discordant for autism disorder. American Journal of Psychiatry, 166(8), 917–925. https://doi.org/10.1176/appi.ajp.2009.08101538
Osterling, J., & Dawson, G. (1994). Early recognition of children with autism: A study of first birthday home videotapes. Journal of Autism and Developmental Disorders, 24, 247–257.
Patten, E., Belardi, K., Baranek, G. T., Watson, L. R., Labban, J. D., & Oller, D. K. (2014). Vocal patterns in infants with autism spectrum disorder: Canonical babbling status and vocalization frequency. Journal of Autism and Developmental Disorders, 44(10), 2413–2428. https://doi.org/10.1007/s10803-014-2047-4
Ramos-Cabo, S., Vulchanov, V., & Vulchanova, M. (2019). Gesture and language trajectories in early development: An overview from the autism spectrum disorder perspective. Frontiers in Psychology, 10, 1211. https://doi.org/10.3389/fpsyg.2019.01211
Redcay, E. (2008). The superior temporal sulcus performs a common function for social and speech perception: Implications for the emergence of autism. Neuroscience and Biobehavioral Reviews, 32(1), 123–142. https://doi.org/10.1016/j.neubiorev.2007.06.004
Sperdin, H. F., & Schaer, M. (2016). Aberrant development of speech processing in young children with autism: New insights from neuroimaging biomarkers. Frontiers in Neuroscience, 10, 393. https://doi.org/10.3389/fnins.2016.00393
Tek, S., Mesite, L., Fein, D., & Naigles, L. (2014). Longitudinal analyses of expressive language development reveal two distinct language profiles among young children with autism spectrum disorders. Journal of Autism and Developmental Disorders, 44(1), 75–89. https://doi.org/10.1007/s10803-013-1853-4
Wetherby, A. M., Watt, N., Morgan, L., & Shumway, S. (2007). Social communication profiles of children with autism spectrum disorders late in the second year of life. Journal of Autism and Developmental Disorders, 37(5), 960–975. https://doi.org/10.1007/s10803-006-0237-4
Wetherby, A., Woods, J., Allen, L., Cleary, J., Dickinson, H., & Lord, C. (2004). Early indicators of autism spectrum disorders in the second year of life. Journal of Autism and Developmental Disorders, 34(5), 473–493.
Poor eyesight is becoming a common, widespread issue across the globe. Individuals are developing vision problems, ranging from severe eye conditions to slight impairment, more often than ever. Though some visual impairments are not correctable, the most common forms – such as errors of refraction – are. However, before treatments for poor eyesight can be investigated, a solid understanding of where vision impairments stem from is needed.
Understanding Poor Eyesight
In comparison to the other four key senses, a larger part of the brain is dedicated to vision, making the eye one of the body's most highly developed sensory organs. This makes vision not only one of the body's key functions, but also one of its most vulnerable.
The most common forms of vision impairment are errors of refraction, which occur when light rays fail to focus inside the eye and, thus, transfer blurry images to the brain. Examples of errors of refraction include nearsightedness, farsightedness, and astigmatism. Fortunately, such vision impairments are typically correctable using glasses, contact lenses, or refractive surgery, such as LASIK, photorefractive keratectomy, or implantable collamer lenses.
Another common form of poor eyesight is eye strain. This occurs when individuals overuse their eyes for an extended period of time and can also be due to an uncorrected refractive problem. Eye strain may occur during prolonged focusing or visually demanding activities (driving, watching a movie, reading, computer use, etc.). Eye strain is not common among children because of their flexible focusing capacity; however, if the eyes are not given adequate rest, adults may experience its effects through headaches, brow aches, eye fatigue, or a pulling sensation. Additionally, experiencing these symptoms while wearing glasses could indicate the need for a prescription change.
Other forms of vision problems often relate to eye disease. Examples of such include retinal detachment, macular degeneration, cataracts, and glaucoma. These can lead to blurry or defective vision and require surgery for correction.
How Common Is The Problem?
These vision impairments are common among individuals worldwide. At least 2.2 billion people have poor eyesight, and in at least 1 billion of these cases the impairment could have been prevented or has yet to be addressed. Such unaddressed vision impairments are most common in low- and middle-income regions, which report about four times as many cases as high-income regions. In some countries, eye care needs are greatest in rural areas while services are concentrated in urban hospitals, limiting accessibility. Furthermore, eye conditions are projected to worsen in the future due to the aging population, genetics, ethnicity, lifestyle, and environmental factors. Thus, it is crucial to understand what makes eyesight worse and what can help it improve.
What 4 Key Practices Impact Vision Negatively?
An excess amount of screen time is one of the most widely known factors that impair vision. According to the American Optometric Association (AOA), the average American spends seven hours a day on digital devices, and 58% of adults have experienced digital eye strain as a result. Additionally, according to Common Sense Media’s 2019 Census, eight- to twelve-year-olds spend five hours a day on digital media. These individuals, especially if they use a digital screen for more than two continuous hours, have a much greater risk of eye strain, which can be very damaging and is one of the most common causes of worsening vision problems.
Wearing contact lenses incorrectly is another common cause of impaired vision. The AOA found that 90% of the 45 million contact lens wearers in the US do not follow proper hygiene instructions. One-third sleep while wearing contacts not designed for overnight use, which can lead to inflammation, dry eye, pain, blurry vision, and light sensitivity; these habits deprive the eyes of oxygen and can worsen eyesight as a whole. Furthermore, many also wear contact lenses in the shower or pool, allowing bacteria into the eyes and ultimately causing infection.
Next, not wearing sunglasses to protect from UV radiation causes both short- and long-term damage. Just one day at the beach without eye protection can lead to photokeratitis – sunburn of the eye – which is temporary but very painful. Long-term UV damage can lead to the formation of cataracts and pterygium, an abnormal growth on the white of the eye that can impair vision for a lifetime. There is also a risk of cancer on the eyelids, the skin around the eye, or the eye itself. Additionally, children’s eyes are much more vulnerable to damage from sunlight because their lenses cannot filter UV rays as effectively, allowing damage to the retina. The AOA reports that the average child receives three times the annual UV exposure of an adult and that 80% of such exposure occurs before the age of 20.
Finally, heavy use of eye drops is a common cause of vision impairment. Though the drops may seem to help, they wash away natural tears. Prolonged use can create dependency, leaving the eye unable to moisturize itself and protect its delicate layers. The glands can also become clogged and stop secreting the oils that hold tears in place. Similarly, whitening eye drops used to reduce redness can decrease blood flow, preventing oxygen from reaching the eye; in turn, the blood vessels can become enlarged and even redder due to their inability to deliver oxygen.
What 4 Key Practices Impact Vision Positively?
Just as crucial as understanding what worsens vision is understanding what can improve it. However, there is no method for improving vision directly without corrective measures, because eye shape determines the level of refractive error, and this cannot be changed with exercises or eye training. Thus, the only option is to naturally improve the way the brain and eyes work together by improving eye health and, therefore, vision as a whole.
Eating a balanced, healthy diet rich in antioxidants and vitamin A – such as leafy vegetables, carrots, or fish – is incredibly important. Such foods slow age-related vision loss by strengthening connections between the brain and the eyes. Getting enough sleep also helps: when tired, the eyes strain easily and feel dry and gritty. Regular exercise has been shown to enhance the circulation of blood and oxygen to the eyes, decreasing dry eye and preventing vision loss. Protecting the eyes from UV rays and practicing good eye hygiene are also especially important: washing the hands and face thoroughly and regularly, and keeping cosmetics and other chemicals out of the eyes, can prevent potential damage. Lastly, taking breaks from screen time is essential. The AOA suggests looking at something 20 feet away for 20 seconds every 20 minutes to prevent eye strain.
As a whole, eye conditions are very common among the current population, and the best way to prevent them is to make healthy lifestyle choices from a young age and to see an eye doctor regularly. Taking care of the eyes is crucial and must be prioritized to prevent more individuals from damaging their vision.
Epidemic is defined by the Oxford English Dictionary as “a widespread occurrence of an infectious disease in a community at a particular time”. Malaria is classed as an epidemic, currently affecting over 100 countries, predominantly in the tropical regions, and is the fourth highest cause of death in children under the age of five years1. Despite the extensive death and casualty toll, as well as the havoc wreaked on socio-economic conditions in areas of outbreak, these epidemics receive significantly less media coverage and humanitarian attention because they affect the developing world. Even after the original health threat has been managed, developing countries continue to face serious long-term effects compared to more economically developed ones. This can be shown by what activists refer to as the ‘10/90 Gap’: the idea that only 10% of global health research funding is allocated to diseases responsible for 90% of preventable deaths globally. Diseases in this neglected 90% are typically referred to as ‘neglected diseases’ because they predominantly affect lower-income countries where poverty and malnutrition are rife, which exacerbates the spread of such life-threatening communicable diseases. According to the World Health Organisation, 45% of the ‘disease burden’ in the poorest and most under-developed countries derives from poverty. Therefore, these diseases cannot merely be treated medically; the deeper social problems in the affected regions need to be tackled as well.
Malaria is an example of a disease whose impact can be understated and disregarded due to societal prejudice against less economically developed regions of the world. Malaria mostly affects tropical regions, including (but not limited to) large zones of Africa, particularly sub-Saharan Africa, as well as South America, the Dominican Republic, the Caribbean and Central America2. While there are recorded cases of malaria in developed countries such as the USA and the UK, these are almost exclusively the result of travellers returning from countries where malaria is prevalent. Malaria disproportionately affects the continent of Africa: in 2019, 94% of malaria cases and deaths occurred there3. Nearly 50% of the global population was at risk of contracting malaria in 2019, and that same year there were approximately 229 million recorded cases of the disease across the world. Despite the fact that malaria is both curable and preventable, it receives significantly less funding than diseases and conditions which disproportionately affect the developed world – such as obesity, the fifth most important risk factor for disease in developed countries. In 1986, the US spent approximately $39 billion on tackling obesity, and this rose to $190 billion by 20054.
How malaria affects the body:
Malaria is caused by Plasmodium parasites; five different species can cause malaria in humans, the two most predominant being P. falciparum and P. vivax3. The former is the main cause of malaria in Africa, south-east Asia, and the Pacific, while the latter presents the greater threat in South and Central America. These plasmodium species are transmitted by an animal vector: the female Anopheles mosquito. In rarer instances, malaria can be transmitted by sharing unsterilised needles, via blood transfusion, or from mother to foetus. When a female Anopheles mosquito takes a blood meal from a host infected with malaria, it injects saliva into the host’s skin while sucking blood through its proboscis. This saliva acts as a mild anaesthetic, which is why the bite is difficult to notice. As the mosquito takes up the infected blood, the male and female gametes of the malaria-causing plasmodium fuse in the mosquito’s stomach. Cell division then occurs, leading to the formation of thousands of immature malarial parasites, which go on to invade the mosquito’s salivary glands. When the mosquito next takes a blood meal from an uninfected host, it injects its saliva – containing the plasmodium parasites – into the host’s skin. The parasites are then able to infect the host’s bloodstream, resulting in the manifestation of the disease. One reason why the majority of malaria cases and deaths occur in Africa is the long lifespan of the African vector species of female Anopheles mosquito: the parasite has a longer time to develop inside the mosquito, so more parasites can be produced.
The symptoms of malaria typically appear 10 to 15 days after an individual is bitten by a mosquito vector3. Early symptoms can include headache, fever, and chills, but these are typically mild and so not necessarily immediately recognisable as malaria. However, the severity of the disease soon increases: failure to receive treatment within 24 hours of the first symptoms of P. falciparum infection can mean that the disease progresses swiftly and often culminates in death. Severe cases of malaria can also involve severe anaemia, liver damage and multiple-organ failure. Children, especially those under the age of five, are the group most greatly affected by malaria: in 2019, 67% of all malaria deaths globally (approximately 274,000 people) were children below five years of age. Other high-risk groups include pregnant women, non-immune migrants and travellers, and individuals with HIV or AIDS.
As malaria is transmitted by mosquito vectors, the most straightforward and accessible method of prevention is mosquito nets, which can be made more effective if treated with insecticide. This, in conjunction with personal use of insect repellent, has been shown to reduce the risk of malarial infection by up to 80%5. Wearing clothing that limits skin exposure also reduces the chance of receiving a mosquito bite which could potentially be fatal. Individuals travelling to countries where malaria occurs can obtain medication to prevent them from contracting the disease.
Malarial chemoprophylaxis is only available in European countries, and only for travellers to countries where malaria is prevalent – not for the inhabitants of the affected areas6. Malarial chemoprophylaxis is classified into three groups in order to determine the most suitable drug for the individual; the drug recommended depends on factors such as the duration of potential exposure, the traveller’s age, and the climate of the destination7. Antimalarial tablets can reduce the chances of becoming infected with malaria by approximately 90%. The main types of antimalarial medication are atovaquone plus proguanil, doxycycline, mefloquine (Lariam) and chloroquine plus proguanil. Chloroquine plus proguanil is still available for travellers but is rarely recommended now due to its ineffectiveness against P. falciparum; it can still be prescribed if the individual is visiting an area where this plasmodium is less common, such as Sri Lanka.
Recently, there has been promising research into a new malaria vaccine, developed at the University of Oxford’s Jenner Institute. In a small clinical trial involving 450 children, this vaccine showed up to 77% efficacy – a dramatic increase over the current vaccine’s efficacy8. Larger clinical trials are undoubtedly needed to ensure the safety and effectiveness of this vaccine, but surely this research should be pushed ahead – as the Covid-19 vaccines were – when hundreds of thousands of people die every year from malaria. This is not to dispute the urgent global need for vaccines against Covid-19, but when a disease is as widespread and life-threatening as malaria, surely it demands the same urgency to tackle the relentless death toll each year. Yet this is not the case, and one key reason is that malaria mainly affects lower-income, under-developed countries.
There is presently only one vaccine against malaria, sold under the brand name Mosquirix9. It requires four injections and, even then, offers only approximately 30% protection against severe malaria, and only for up to four years – raising questions as to whether the vaccine is cost-efficient. Additionally, there are concerns over its safety: in a clinical trial for Mosquirix, the children who received the vaccine had a risk of contracting meningitis ten times higher than the children who received the placebo. While there is insufficient evidence to establish causation, this concern has impeded rollouts of the vaccine. The new malaria vaccine could gradually replace Mosquirix and could mean that more individuals are protected against this disease.
Malaria is a curable disease and is treated using antimalarial drugs. The most common of these is chloroquine phosphate, but unfortunately this treatment is gradually being rendered ineffective by the increasing resistance of malarial parasites. Another type of antimalarial treatment is artemisinin-based combination therapy (ACT), a combination of antimalarial drugs used mostly where there is resistance to chloroquine phosphate. Primaquine phosphate is another frequently used antimalarial drug, in addition to quinine sulfate with doxycycline10. Noticeably, many antimalarial drug names contain ‘quine’ or ‘quinine’, as they often contain quinine, a chemical compound naturally derived from the bark of the cinchona tree.
Currently the Mayo Clinic is carrying out a clinical trial entitled “A Study to Evaluate Intravenous Artesunate to Treat Severe Malaria in the United States”11, which hopes to make intravenous artesunate available for the treatment of severe malaria. As yet, there are no publications from this clinical trial.
The global management of malaria:
Currently there are global initiatives attempting to end the spread of malaria. One such example is the Mekong Malaria Elimination (MME) programme organised by the World Health Organisation (WHO)12. The MME programme is working towards eliminating malaria in Myanmar, Cambodia, the Lao People’s Democratic Republic, China, Vietnam, and Thailand, and began in 2017 in response to the increasing ineffectiveness of specific antimalarial drugs as a result of drug-resistant malarial parasites. On the 3rd of November 2020, Cambodia committed to completely eradicating P. falciparum by 202313. Dr Li Ailan, the WHO Representative to Cambodia, stated: “Cambodia, being very close to the goal, can be the first country in the region to eliminate P. falciparum malaria, serving as a champion in the Greater Mekong Subregion.” As part of this ambitious commitment, three main interventions have been set out: the distribution of mosquito screens and nets; weekly fever screening of every household, with any individual with a fever tested for malaria and treated if positive; and improved preventive measures for travellers to areas at risk of the spread of malaria.
Arguably malaria, as a disease alone, is theoretically relatively easy to prevent and treat: through insecticides, mosquito nets, vaccination and antimalarials. In reality, the countries most affected by this disease do not have sufficient funds to provide these measures, let alone tackle the deep-rooted causes – poverty, poor access to clean drinking water and sanitation infrastructure, and malnutrition alongside food insecurity – which collectively establish an environment in which diseases like malaria can reach epidemic levels. There needs to be a greater collective, global effort to tackle so-called ‘poverty-related diseases’ like malaria, as well as cholera, typhoid and diphtheria among others, as there was with smallpox and as there is now with Covid-19. That is not to criticise developed countries for using their funds to further healthcare and medical research at home – this is essential for advancing medicine, and understandably they want to improve healthcare in their own countries first and foremost. Nor is it to say that wealthier nations should extensively increase the amount of aid they give to ‘fix’ the healthcare problems of other nations – but, when aid is given, it should perhaps be directed at the larger and longer-term socio-economic problems that allow diseases to manifest.
When aid is given in order to reduce the spread of malaria in the worst-impacted regions, it should simultaneously be used to improve the conditions which allow the disease to manifest and become so widespread. However, simply providing money to ‘fix’ these socio-economic problems is not a straightforward answer, as it ignores factors such as corruption, debt and the prioritisation of immediate humanitarian aid over longer-term social problems, which are more difficult to fix due to their longevity and severity. Referring back to the ‘10/90 Gap’, the balance between funding for diseases mainly affecting the developed world and for the ‘neglected diseases’ more prevalent in the developing world needs to be redressed. When the global disease burden is significantly greater in less developed regions, more aid needs to be directed there, rather than disproportionately towards less widespread and less prevalent diseases that impact more developed regions.
There is no single, clear solution to the problem of malaria – or, more broadly, the ‘10/90 Gap’. However, it is undeniable that if malaria were as rampant in the developed world as it presently is in the developing world, there would be a concerted global effort to tackle the disease, and such a gap in funding would not be so significant.
5. Hill N, Lenglet A, Arnéz AM, Carneiro I. Plant based insect repellent and insecticide treated bed nets to protect against malaria in areas of early evening biting vectors: double blind randomised placebo controlled clinical trial in the Bolivian Amazon. BMJ. 2007;335(7628):1023.