The Bone Splint System: A Revolutionary Treatment for Osteogenesis Imperfecta

Osteogenesis imperfecta (OI) is a group of genetic disorders that mainly affect the bones. Characterized as “a hidden autosomal inheritance disorder,” the condition leads to imperfect formation of the skeletal system. Its consequences include – but are not limited to – fragile bones, short stature and other malformations. Individuals with mild forms of OI typically have a blue or grey tint to the part of the eye that is usually white (the sclera), and may also develop hearing loss in adulthood [2]. In severe cases, bone fractures can occur even before birth.


Both severe and minor bone fractures can be treated with a novel therapeutic method called the “bone splint system.” A study published in the Indian Journal of Orthopaedics, “Results of a bone splint technique for the treatment of lower limb deformities in children with type I osteogenesis imperfecta” by Ling et al., reports that patients treated with this system could actually walk without other forms of physical support [1].


The team conducting this study analyzed the site of the deformity on the femur of children with type I OI. Since the bones of OI patients “bend anterolaterally,” the site was incised at both the lateral and anterior sections. To establish the splint system, a plate was first placed on the lateral side of the femur so that it spanned the whole deformity. Two screws were placed at each end of the plate and stabilized by insertion through three layers [Figure 1] [1]. The resulting bone splint is composed of the steel plate, the screws and what is known as an “allograft” (donor bone). Because this technique distributes the load evenly between the bone and the splint, it permits “decreased bending and torsion stress on the plate.” Consequently, the fixation performed better and the risk of fracture at the implant was reduced [1].

This method appears to be a comparatively durable solution: the fixation remained stable throughout follow-up, even after more than five years, and in all cases “there was no evidence of loosening or breakage of screws.” Additionally, the bone splint technique favored faster healing of the fracture by blocking cartilage from intruding into the fracture site. Ultimately, the increased bone mass, improved bone strength and successful healing mark the bone splint system as an effective treatment for lower limb deformities in children with type I OI.

Works Cited

  1. Ling, Dasheng, Wenliang Zhai, Kejian Lian, and Zhengi Ding. “Results of a Bone Splint Technique for the Treatment of Lower Limb Deformities in Children with Type I Osteogenesis Imperfecta.” Indian Journal of Orthopaedics, 12 July 2013. Web. 29 Feb. 2016. <http://www.ijoonline.com/showBackIssue.asp?issn=0019-5413;year=2013;volume=47;issue=4;month=July-August>.
  2. “Osteogenesis Imperfecta.” Genetics Home Reference, 1 Apr. 2013. Web. 1 Mar. 2016. <https://ghr.nlm.nih.gov/condition/osteogenesis-imperfecta>.
  3. Sinder, B.P., et al. National Center for Biotechnology Information. U.S. National Library of Medicine, 1 Jan. 2016. Web. 15 Feb. 2016. <http://www.ncbi.nlm.nih.gov/pubmed/26769006>.
  4. N.p., n.d. Web. 5 Mar. 2016. <http://cache3.asset-cache.net/xt/525443937.jpg?v=1&g=fs1%7C0%7CSPF%7C43%7C937&s=1>.

 


Increasing the Human Brain Memory Capacity: Drugs that Enhance Norepinephrine and Dopamine Functionality in the Prefrontal Cortex

The human brain is often described as the most complex structure known. Humans have surpassed all other life forms on Earth owing to its power, yet there is a constant drive to reach even greater heights. Questions such as “What is the size of the universe?” or “What is the meaning of life?” are yet to be answered. To take a stab at these questions, or even to begin thinking about possible solutions, the human brain could certainly use further enhancement. Alas, greater brain capacity is necessary merely to survive the cutthroat competition of modern society!


For instance, have you ever had trouble recalling a crucial equation during an exam right after hastily trying to “memorize” it, or had trouble remembering someone’s name moments after his or her introduction? These processes rely on a short-term memory system known as working memory. Studies have determined that the prefrontal cortex (PFC) is a brain region implicated in working memory processes [1]. Training techniques such as chunking can help the brain retrieve more from short-term memory, but they do not necessarily increase its underlying ‘physical storage’ capacity.

There has consequently been great recent interest in enhancing the brain’s working memory capacity. After much research on rats, monkeys and other mammals, scientists have begun human trials to figure out ways to enhance working memory. Current studies on catecholamines, a family of adrenergic neurotransmitters that includes norepinephrine and dopamine, suggest that increasing their functionality could significantly improve human working memory [2]. These neurotransmitters are abundant in the prefrontal cortex and play a critical role in the neural pathways that contribute to working memory.

More specifically, while norepinephrine strengthens PFC network connectivity and maintains persistent firing during a working memory task, dopamine acts on D1 receptors to narrow spatial tuning, sculpting network inputs to decrease noise [4]. Norepinephrine is synthesized from dopamine by dopamine beta-hydroxylase: dopamine is transported into synaptic vesicles, where the conversion takes place, and the vesicles are moved along the axons of the noradrenergic bundle to release sites, where norepinephrine activates protein cascades at synaptic clefts [3]. Essentially, scientists are now trying to reproduce the function of this memory transmitter system using synthetic medications in place of the natural catecholamines.
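For reference, the catecholamine synthesis pathway (standard textbook biochemistry, not a detail taken from the cited articles; enzymes shown above the arrows) runs:

```latex
\[
\text{tyrosine}
\;\xrightarrow{\text{tyrosine hydroxylase}}\;
\text{L-DOPA}
\;\xrightarrow{\text{AADC}}\;
\text{dopamine}
\;\xrightarrow{\text{dopamine }\beta\text{-hydroxylase}}\;
\text{norepinephrine}
\]
```

so dopamine is the direct biochemical precursor of norepinephrine in noradrenergic neurons, which is why the two transmitters are so often studied together.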


Thus, a great deal of research has since been undertaken to find drugs that could mimic the effects of catecholamines. One of the main goals is to replicate norepinephrine’s strengthening of network connectivity at postsynaptic alpha-receptors in the PFC and to enhance spatial tuning (selectivity) the way dopamine does. After establishing an empirical link between a dopamine-mediated working memory system and higher cognitive functions in humans [1], researchers administered dopamine receptor agonists such as bromocriptine and pergolide to young volunteers. Because bromocriptine binds to the same receptors as dopamine, it can play a similar role to the natural catecholamines, and the results were striking: a few milligrams of bromocriptine over a period of around two weeks not only enhanced memory but also improved attention, similar to what high catecholamine levels would accomplish [8]. These and similar results are promising and may usher in a new era of enhanced human memory capacity.

Although these findings are in their initial stages and continually evolving, they can serve as a springboard for further research. Consider stimulant medications such as methylphenidate (Ritalin), which not only improve performance on working memory tasks but also enhance PFC function by indirectly increasing catecholamine actions through blockade of norepinephrine and dopamine transporters [6]. Donepezil is another drug that can potentially improve memory: it is primarily a cholinesterase inhibitor, impeding the cholinesterase enzyme and thereby decreasing the degradation of acetylcholine [5]. Strides like these will set the stage for more advanced research on the subject and offer an opportunity to improve our intelligence, perhaps one day enabling us to finally solve the currently unanswerable mysteries.

 

Works Cited

  1. Kimberg D. Y., D’Esposito M., Farah M. J. (1997). Effects of bromocriptine on human subjects depend on working memory capacity. Neuroreport 8 3581–3585. 10.1097/00001756-199711100-00032
  2. “Catecholamines.” com. Web. 3 Jan. 2016.
  3. Tully, Keith, and Vadim Y. Bolshakov. “Emotional Enhancement of Memory: How Norepinephrine Enables Synaptic Plasticity.” Molecular Brain, 13 May 2010. Web. 1 Dec. 2015.
  4. Arnsten, Amy F.T. “Catecholamine Influences on Dorsolateral Prefrontal Cortical Networks.” Biological Psychiatry 69.12 (2011): e89–e99. PMC. Web. 13 Jan. 2016.
  5. Chuah, Lisa Y.M. et al. “Donepezil Improves Episodic Memory in Young Individuals Vulnerable to the Effects of Sleep Deprivation.” Sleep 32.8 (2009): 999–1010. Print.
  6. Cools, R, and M D’Esposito. “Inverted-U Shaped Dopamine Actions on Human Working Memory and Cognitive Control.” Biological Psychiatry 69.12 (2011): e113–e125. PMC. Web. 13 Jan. 2016.
  7. Clark, Kelsey L., and Behrad Noudoost. “The Role of Prefrontal Catecholamines in Attention and Working Memory.” Frontiers in Neural Circuits 8 (2014): 33. PMC. Web. 13 Jan. 2016.
  8. Wallace, Deanna L. et al. “The Dopamine Agonist Bromocriptine Differentially Affects Fronto-Striatal Functional Connectivity During Working Memory.” Frontiers in Human Neuroscience 5 (2011): 32. PMC. Web. 15 Jan. 2016.
  9. 14 Jan. 2016. <http://www.mindbodyvortex.com/wp-content/uploads/2015/09/Frontal-lobe.jpg>.

 

 

 


Attack of the Zika

The human body is an impeccable machine made of tiny, living particles that compose every part of our physical existence – cells. Without our conscious thought, our cells produce the necessary proteins, enzymes, and hormonal responses that allow us to function normally on a daily basis.

In comes the virus – a semi-alive being that can interrupt this process entirely. Essentially “an infectious particle made of biological information wrapped in a protein coat,” this tiny invader can wreak havoc on our cells (Brookshire). One of the most consequential invaders in recent memory has been the Zika virus.

Named after the Zika forest in the East African country of Uganda, this virus hijacks people’s cells, and most of those infected will remain unaware of the invader in their body. However, about one out of five infected people develop visible symptoms such as pink eye, fever, or rash. Nonetheless, the real scare of the Zika virus is its link to microcephaly – “a condition in which babies are born with small heads and brains that aren’t fully developed” (Brookshire).

The recent wave of these brain disorders and birth defects linked to the Zika virus prompted the World Health Organization (WHO) to declare a public health emergency on February 1 (Rosen). Although it was first spotted in Uganda, the virus – spread by mosquitoes – has infected people in Brazil, Central America, and Mexico. While scientists do not yet know precisely how the virus causes these effects, experts believe that infection of pregnant women with the Zika virus causes their infants to be born with small heads and brain defects.

Brazil alone has reported over 4,000 cases of suspected microcephaly since October, and this number is continuously on the rise (Rosen). Furthermore, there have been several cases of the Zika virus reported in the United States, with eight of these cases appearing in California alone. According to the San Jose Mercury News, “Half of those have been in Bay Area counties. Contra Costa County has reported two cases, with San Francisco and Napa counties reporting one each” (Seipel). Therefore, people (especially pregnant women) living or traveling in these regions have been warned to avoid mosquito bites (through bug spray, proper clothing, mosquito nets, etc.) in order to prevent infection with the Zika virus.

Brookshire, Bethany. “Scientists Say: Zika.” Student Science. Society for Science & the Public, 15 Feb. 2016. Web. 22 Feb. 2016. <https://student.societyforscience.org/blog/eureka-lab/scientists-say-zika>.

Rosen, Meghan. “Zika Worries Go Global.” Student Science. Society for Science & the Public, 2 Feb. 2016. Web. 22 Feb. 2016. <https://student.societyforscience.org/article/zika-worries-go-global>.

Seipel, Tracy. “California Zika Virus Cases Inch up to Eight.” San Jose Mercury News. Digital First Media, n.d. Web. 6 Mar. 2016. <http://www.mercurynews.com/health/ci_29596427/california-zika-virus-cases-inch-up-eight>.

 


Second Round for POLARBEAR: Improving Measurements of CMB Polarization

Cosmology (the study of the origin and evolution of our universe) may seem to be one of the most daunting subfields of physics. We can’t travel back in time to watch the Big Bang for ourselves, nor can we see any light that escaped from the universe’s very first moments. Fortunately, the universe did leave us a trace of its childhood in the Cosmic Microwave Background (CMB), which is made up of the oldest photons we will ever be able to detect.

Just after the Big Bang, our universe existed in a very hot and dense state. Atoms could not exist until the universe was roughly 380,000 years old and had expanded enough for the temperature to fall to about 3000 K (roughly 5000°F). After the formation of neutral atoms, photons were able to travel freely without scattering, so their orientation, or polarization, stopped changing.
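As a rough check on that 3000 K figure (standard cosmology, not a number taken from the POLARBEAR group), the CMB temperature simply scales with redshift:

```latex
\[
T(z) = T_{0}\,(1+z), \qquad T_{0} \approx 2.725\ \mathrm{K}, \qquad z_{\mathrm{rec}} \approx 1100
\;\;\Rightarrow\;\; T(z_{\mathrm{rec}}) \approx 3000\ \mathrm{K}.
\]
```

In other words, the photons released at recombination have been stretched by a factor of about 1,100 on their way to us, cooling from roughly 3000 K down to the 2.7 K microwave background we measure today.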

Here at Cal, Prof. Adrian Lee’s experimental cosmology group is looking at the polarization of the CMB. In a 2014 interview, Adrian used a romantic metaphor to describe this polarization: “Think of it like this: the photons are bouncing off the electrons, and there is basically a last kiss, they touch the last electron and then they go for 14 billion years until they get to telescopes on the ground. That last kiss is polarizing.” He is describing the stage in our universe’s history when photons could finally travel freely. More specifically, the POLARBEAR experiment looks at B-modes of the CMB. B-mode polarization patterns could have been imprinted at the photons’ last point of scattering by primordial gravitational waves, and they are also generated after the photons have begun travelling freely, when their paths pass through the gravitational fields of large structures. The polarization pattern on the sky can be decomposed into E-modes and B-modes, which differ geometrically as seen in the image below.

[Image: E-mode and B-mode polarization patterns]
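For readers who want the math, the usual way to separate the two patterns (standard flat-sky CMB notation, not specific to the POLARBEAR papers) is to combine the Fourier transforms of the Stokes parameters Q and U:

```latex
\[
E(\boldsymbol{\ell}) = \tilde{Q}(\boldsymbol{\ell})\cos 2\phi_{\ell} + \tilde{U}(\boldsymbol{\ell})\sin 2\phi_{\ell},
\qquad
B(\boldsymbol{\ell}) = -\tilde{Q}(\boldsymbol{\ell})\sin 2\phi_{\ell} + \tilde{U}(\boldsymbol{\ell})\cos 2\phi_{\ell},
\]
```

where φ_ℓ is the angle of the Fourier wavevector ℓ. E-modes are the “gradient-like” patterns produced by ordinary density fluctuations, while B-modes are the “curl-like” patterns that only gravitational waves or gravitational lensing can produce.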

The current experiment, POLARBEAR 1 (PB1), employs microwave detectors on the Huan Tran Telescope in Chile’s Atacama Desert. POLARBEAR is a collaboration of over 70 researchers around the world and began observations in 2012. PB1 was the first experiment to successfully detect pure B-modes formed through gravitational lensing. This measurement lets the group determine the total mass along the path of each photon, and their main focus so far has been to map the matter distribution of the universe and to trace signals reaching back to the universe’s “inflationary” period. Inflation is a term used to describe the exponential expansion of the universe in an extraordinarily short time very shortly after the Big Bang. POLARBEAR measurements have very wide astrophysical applications, including providing evidence for inflation, constraining the mass of the neutrino, and tracing the evolution of dark energy.


POLARBEAR 2 (PB2), the newest addition to the POLARBEAR program, will include three different experiments and new telescopes with higher sensitivity. Its first upgrade to PB1 is expected to begin observing sometime late next year, and the three new telescopes are intended to completely replace PB1 within the next three years. “The biggest deal is that PB2 is much more sensitive than PB1. Our ability to measure the actual signal of the CMB is limited by our number of detectors. PB1 has only a small set of detectors, with only one detector attached to each antenna. Whereas on PB2, we’re making our focal plane, where all of our detectors sit, a lot larger. This enables us to more than double the amount of our detectors,” said Charles Hill, a third-year graduate student working on the PB2 experiment.

This international team is working tirelessly to produce the second round of experiments for POLARBEAR.

 

If you would like to read more about the POLARBEAR telescope and see who makes up the team of collaborators from UC Berkeley and beyond, see the experiment’s website below:

http://bolo.berkeley.edu/polarbear/

 

You can read the 2014 UC Berkeley news article about POLARBEAR here:

http://news.berkeley.edu/2014/10/21/polarbear-seeks-cosmic-answers-in-microwave-polarization/

 

You can read more about E-mode and B-mode polarization here:

http://background.uchicago.edu/~whu/intermediate/Polarization/polar5.html


Activity-Based Protein Profiling: Discovering Novel Therapeutic Strategies for Disease

In the post-genomic era, we face the daunting challenge of translating genomic information into cures for human diseases. One major bottleneck for drug discovery is that much of the genome remains uncharacterized, hampering efforts to uncover the biological or therapeutic functions of its gene products. Another challenge is that many of the protein targets known to drive human diseases belong to a category called “undruggable,” meaning that we have no pharmacological or biologic tools that can act on the target to develop a therapeutic.

Activity-based protein profiling (ABPP) is an exciting technology that is enabling researchers to identify new therapeutic targets and even develop new and safer pharmacological agents. ABPP uses active-site directed chemical probes to assess the functional states of large numbers of proteins directly in complex biological systems. These activity-based probes consist of a chemical warhead that binds to functional sites within proteins and a handle that can be used to visualize these targets by fluorescence or to identify them by mass spectrometry-based proteomics. When applied to various disease settings, ABPP has been used to discover many promising therapeutic targets for diseases such as cancer, inflammation, depression and anxiety, obesity, and neurodegenerative diseases.

An additional unique feature of ABPP is that it also enables the development of inhibitors against the protein targets labeled by activity-based probes, even those that are uncharacterized and considered undruggable. Because the activity-based probes bind to functional sites within proteins, small-molecule inhibitors can be competed against probe binding, enabling inhibitor discovery and accelerating the process of drug discovery. In addition, because the probes bind many proteins in parallel, the selectivity of these inhibitors can be assessed on a proteome-wide level. Developing selective inhibitors that inhibit the therapeutic target but not unrelated off-targets helps mitigate toxicities and side effects, thus contributing to new and safer drugs.
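To make the competition idea concrete, here is a minimal sketch, assuming made-up protein names and intensity values (an illustration of the logic, not an actual ABPP analysis pipeline): the probe-labeling signal for each protein is compared with and without pre-treatment by a candidate inhibitor, and strong loss of labeling flags the proteins whose functional sites the inhibitor occupies.

```python
# Minimal sketch of competitive ABPP analysis (hypothetical proteins and intensities).
# A protein whose probe labeling drops sharply after inhibitor pre-treatment is
# likely engaged by the inhibitor at its functional site.

probe_only = {            # labeling intensity, vehicle (DMSO) control
    "TARGET_X": 1000.0,
    "OFFTARGET_A": 800.0,
    "OFFTARGET_B": 950.0,
}
probe_plus_inhibitor = {  # labeling intensity after inhibitor pre-treatment
    "TARGET_X": 50.0,
    "OFFTARGET_A": 780.0,
    "OFFTARGET_B": 900.0,
}

def percent_inhibition(control: float, treated: float) -> float:
    """Fraction of probe labeling blocked by the inhibitor, as a percentage."""
    if control <= 0:
        return 0.0
    return max(0.0, (1.0 - treated / control) * 100.0)

for protein, control in probe_only.items():
    treated = probe_plus_inhibitor[protein]
    inhib = percent_inhibition(control, treated)
    note = "potently inhibited" if inhib >= 80.0 else ""
    print(f"{protein:12s} {inhib:5.1f}%  {note}")
```

In a real experiment the intensities would come from gel fluorescence or quantitative mass spectrometry, and because every probe-labeled protein is profiled in parallel, the same table immediately reveals off-target liabilities.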


ABPP, which originated in Benjamin Cravatt’s laboratory at The Scripps Research Institute, has contributed to several interesting discoveries. For example, Cravatt and Daniel Nomura, now an Associate Professor at UC Berkeley, demonstrated that the enzyme monoacylglycerol lipase (MAGL) controls a fatty acid network that contributes to tumor growth (6). Their study showed that MAGL could be a target for cancer treatment and, curiously, suggested a link between a high-fat diet and cancer progression. Nomura and Cravatt also showed that inhibiting MAGL in the brain elevates endogenous cannabinoid lipid levels that act on the cannabinoid receptors (the same receptors that THC from marijuana binds) and lowers pro-inflammatory eicosanoid lipid levels, inhibiting inflammation and protecting against neurodegeneration in the brain. Very potent and selective MAGL inhibitors have been developed using the ABPP technology and have recently entered Phase I clinical trials in humans through Abide Therapeutics in La Jolla, CA. Nomura’s lab at Berkeley continues to use ABPP to identify novel therapeutic targets and potential therapeutic leads for treating cancer.

Hopefully, in the near future, drugs created thanks to ABPP will be successful in clinical trials.

Acknowledgments

Daniel Nomura has critically read and edited the blog post, sharing his valuable insights into ABPP technology.

References

1. National Cancer Institute. SEER Stat Fact Sheets: All Cancer Sites. http://seer.cancer.gov/statfacts/html/all.html (accessed Nov 22, 2015).

2. The Scripps Research Institute. The Cravatt Lab Research. http://www.scripps.edu/cravatt/research.html (accessed Nov 22, 2015).

3. Medina-Cleghorn, D.; Nomura, D. K. Exploring Metabolic Pathways and Regulation through Functional Chemoproteomic and Metabolomic Platforms. Chemistry & Biology. 2014, 21, 1171-1184. http://www.sciencedirect.com/science/article/pii/S1074552114002361 (accessed Nov 22, 2015).

4. Cravatt, B. F.; Wright, A. T.; Kozarich, J. W. Activity-Based Protein Profiling: From Enzyme Chemistry to Proteomic Chemistry. Annu. Rev. Biochem. 2008, 77, 383-414. http://www.annualreviews.org/doi/pdf/10.1146/annurev.biochem.75.101304.124125 (accessed Nov 22, 2015).

5. Bogyo, M. Finding enzymes that are actively involved in cancer. Proc. Natl. Acad. Sci. U.S.A. 2010, 107, 2379-2380. http://www.pnas.org/content/107/6/2379.full (accessed Nov 22, 2015).

6. Nomura, D. K.; Long, J. Z.; Niessen, S.; Hoover, H. S.; Ng, S. W.; Cravatt, B. F. Monoacylglycerol lipase regulates a fatty acid network that promotes cancer pathogenesis. Cell, 2010, 140, 49-61. http://www.sciencedirect.com/science/article/pii/S0092867409014391 (accessed Nov 22, 2015).

7. Long et al. Selective Blockade of 2-Arachidonoylglycerol Hydrolysis Produces Cannabinoid Behavioral Effects. Nat. Chem. Biology. 2009, 5, 37-44. http://www.nature.com/nchembio/journal/v5/n1/full/nchembio.129.html (accessed Nov 22, 2015).

8. Special Feature. Greatest Hits. Nat. Chem. Biology. 2015, 11, 364-367. http://www.nature.com/nchembio/journal/v11/n6/full/nchembio.1815.html (accessed Nov 22, 2015).


Analysing the Link between Global Warming, Hurricane Patricia, and Future Tropical Storms


For a brief time, Hurricane Patricia took America by storm (pun definitely intended). On the night of Wednesday, October 21st, Patricia was an under-the-radar tropical depression that drew little attention. Then, due to a combination of high ocean temperatures, low pressure, and low wind shear, Patricia began to grow at a rate that astounded and terrified not only scientists, but much of America as well (1). By the morning of Friday, October 23rd, Patricia was “the most powerful storm ever measured by the U.S. Hurricane Center” (2).

Patricia’s rapid growth and destructive power have largely been attributed to the weather phenomenon known as El Niño, in which the temperature of the Pacific Ocean near the equator rises and the air pressure in the eastern Pacific drops. But as Patricia dominated headlines across the country, many media outlets also chose to focus on the effect that climate change, more specifically global warming, has had on the destructive potential of hurricanes. Many of these articles argued that as global warming continues, tropical storms will increase in strength and danger, and they cited Patricia as an example. Many also called for steps to counter global warming to mitigate the future danger of hurricanes, and advocated stronger hurricane defenses to prepare for more powerful storms. However, a number of articles proposed the opposite argument: that Patricia’s record-breaking strength is not the result of global warming.

But looking outside of the realm of popular science, is Patricia evidence that global warming is causing more dangerous hurricanes?

The link between global warming and Patricia is tenuous at best; it is difficult to separate the amplification of Patricia due to El Niño from any amplification due to climate change. Moreover, no single weather event can be solely attributed to a large-scale trend like global warming, nor should a single event or storm be held as indicative of a trend as widespread as climate change. While no scientific literature yet exists on the subject (Patricia is too recent a phenomenon), climate experts like Kerry Emanuel, who was among the first to predict that global warming would increase the strength of hurricanes, have declined to state that Patricia specifically is evidence of the link between climate change and hurricane intensity (3).

However, looking beyond Patricia, does science indicate that global warming has led or will lead to more destructive hurricanes?

Well, kind of, but not really. Recent scientific literature has increasingly found links between climate change and more destructive hurricanes, but many of these articles stop short of explicitly stating a causal link. A 1987 article in Nature by Kerry Emanuel predicted a significant increase in the destructive potential of hurricanes due to greenhouse gas-induced climate change, but an increase of the magnitude Emanuel predicted has not been observed (4).

Emanuel returned to the subject in 2005 and demonstrated an increase in the destructiveness of hurricanes that correlates strongly with increased ocean temperatures (5). Webster et al. also found an increase in the number of category 4 and 5 hurricanes between 1970 and 2005 that was, in their words, “not inconsistent” with models correlating increased hurricane intensity with global warming (6). However, both studies conceded that the measured increases could fall within the normal variance of hurricane intensity.
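For context, the destructiveness measure Emanuel used in the 2005 study is a power dissipation index that integrates the cube of a storm’s maximum sustained wind speed over its lifetime (the normalization shown here is simplified):

```latex
\[
\mathrm{PDI} \;=\; \int_{0}^{\tau} V_{\max}^{3}\, dt
\]
```

where V_max is the storm’s maximum sustained wind speed and τ is its lifetime. Summing this quantity over all storms in a season gives an annual measure of destructive potential, which is why even modest increases in wind speed or storm duration translate into much larger increases in destructiveness.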

Two recent studies in Nature Geoscience tied increased hurricane intensity to greater economic losses in the United States, but both qualified their results with the statement that increased hurricane intensity cannot clearly be tied back to climate change (7, 8).

The end result: scientists have not disproven the notion that global warming will lead to more powerful hurricanes, but they haven’t definitively proven it either. The reality may be that scientists will not be able to definitively state this link exists until these stronger storms are actually upon us. However, even without definitive proof, the evidence in favor of the notion continues to grow.

Works Cited:

  1. Vance, E. (2015, October 23). How Hurricane Patricia Quickly Became a Monster Storm. Retrieved from http://www.scientificamerican.com/.
  2. Chandler, A. (2015, October 23). Bracing for Patricia. Retrieved from http://www.theatlantic.com/.
  3. Mooney, C. (2015, October 23). Why record-breaking hurricanes like Patricia are expected on a warmer planet. Retrieved from https://www.washingtonpost.com/.
  4. Emanuel, K.A. (1987). The dependence of hurricane intensity on climate. Nature, 326, 483–485.
  5. Emanuel, K. (2005). Increasing destructiveness of tropical cyclones over the past 30 years. Nature, 436, 686–688.
  6. Webster, P.J., et al. (2005). Changes in Tropical Cyclone Number, Duration, and Intensity in a Warming Environment. Science, 309 (5742), 1844–1846.
  7. Hallegatte, S. (2015). Climate change: Unattributed hurricane damage. Nature Geoscience, 8, 819–820.
  8. Estrada, F., et al. (2015). Economic losses from US hurricanes consistent with an influence from climate change. Nature Geoscience, 8, 880–884.


The Human Microbiome: Slowly Getting There


 

At this point in time, the study of the human microbiome is not a novelty. Quite a lot of time and money has gone into pursuing this promising field, in the hope that collecting data from the trillions of microorganisms in and on our bodies will offer insights into how they affect health and disease. While the microbiome has been shown to affect us heavily (how we process the food we eat, our immune system and infections, organ development, even behavioral traits), our knowledge of it is still extremely limited. The goal of using the human microbiome to predict an individual’s propensity for certain diseases (and ultimately to prevent them) seems more distant than not.

Part of the reason this research seems to progress slowly is the vast amount of data that needs to be processed and the time required to amass it. Months are required for bacteria collection (mainly from feces, which is relatively unappealing to the masses and probably another reason the field is not popular) and for gene sequencing. Biotech companies such as Biomiic have started working on how to process and present the collected data at a much faster rate (as reported in the following article: http://bostinno.streetwise.co/2015/10/05/havard-ilab-startup-microbiome-data-from-poop/).
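As a toy illustration of the kind of processing involved (the taxa and counts below are invented, not data from any study), one of the first steps in most microbiome pipelines is simply converting raw per-sample sequencing counts into relative abundances:

```python
# Toy example: convert raw 16S read counts per taxon into relative abundances.
# All counts are invented for illustration.
sample_counts = {
    "Bacteroides": 5200,
    "Faecalibacterium": 3100,
    "Escherichia": 150,
    "Lactobacillus": 550,
}

total_reads = sum(sample_counts.values())
relative_abundance = {
    taxon: count / total_reads for taxon, count in sample_counts.items()
}

# Print taxa from most to least abundant.
for taxon, frac in sorted(relative_abundance.items(), key=lambda kv: -kv[1]):
    print(f"{taxon:18s} {frac:.1%}")
```

Real pipelines repeat this across hundreds of samples and millions of reads, which is where the computational bottleneck comes from.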

Once data can be processed more quickly, perhaps the field will advance rapidly. After all, even the world’s largest collaborative biological project, the Human Genome Project, was only possible because of remarkable progress in sequencing and computing technology.

Another reason that often comes up is practicality. To what extent can we utilize microorganisms for therapeutic purposes? Many of the relevant bacteria seem impossible to culture, and even then, we don’t know how effective treatments with microbes are (though in certain cases they have proven very successful, as in the famous case of C. diff infections: http://www.medicalnewstoday.com/articles/291532.php).

In any case, the study of the human microbiome is extremely valuable as our microbiome is an integral part of our lives. Perhaps once it gains more popularity and funding, more will be discovered regarding these organisms that call us their home.


The Enigma of the Brain


The 1,200 cm³ mass of neurons inside our heads, more commonly known as the brain, has been frustratingly elusive for as long as we’ve known of its existence. How does it work? What does ours, as humans, do so differently from that of every other species? What is it about the brain that even makes us “human”? These questions have long tantalized scientists, led to philosophical dead ends, and left us with more questions than we started with. But now, a ground-breaking project is bringing us one step closer to answering them.

The Human Connectome Project is split between two consortia: Washington University, the University of Minnesota, and Oxford University launched the first project, while Harvard, MIT, and UCLA followed close behind. Over five years, the brains of 1,200 participants (twins and their siblings) will be scanned and the data used to map the neural connections that create the massive network within our brains. By using resting-state functional MRI and diffusion imaging, the scientists will slowly be able to uncover the details of brain connectivity. With structural and functional MRI, they can also determine the shape of the cortex and the network’s relationship to behavior.

With an estimated 100 billion neurons comprising the average adult brain, there are on the order of 100 trillion neuronal connections, or synapses, within every single person, each in a unique configuration. Mapping every one of those is no small task, but the insights gained could revolutionize our understanding of the brain and facilitate research into many brain disorders, such as autism, Alzheimer’s, and schizophrenia.
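A quick back-of-the-envelope calculation, using the commonly quoted figure of roughly 1,000 connections per neuron (estimates vary widely by source), shows where the 100-trillion number comes from:

```python
# Back-of-the-envelope estimate of total synapses in an adult human brain.
# Figures are rough, commonly cited values, not measurements.
neurons = 100e9            # ~100 billion neurons
synapses_per_neuron = 1e3  # ~1,000 connections per neuron (order of magnitude)

total_synapses = neurons * synapses_per_neuron
print(f"Estimated synapses: {total_synapses:.0e}")  # -> 1e+14, i.e. ~100 trillion
```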

But most of all, this research could reveal a little more to the secret of what makes us human, and what makes every one of us a unique addition to this world.


Scientists Selling Genetically-Engineered Micro-Pigs

 


Who doesn’t love things that are fun-sized? While most pet owners would gladly keep their furry friends baby-sized forever, a group of scientists in China has taken things a step further. Geneticists from BGI, a leading genomics research institute in Shenzhen, China, have begun selling genetically engineered micro-pigs as pets starting at US$1,600.

By deactivating a growth hormone receptor (GHR) gene, the scientists effectively stunted the growth of Bama pigs. Normally, mature Bama pigs weigh up to 100 pounds, but mature micro-pigs grow to only about 30 pounds, the size of an average dog. By introducing enzymes called transcription activator-like effector nucleases, or TALENs, into the cloning process, the scientists were able to disable one of the two copies of the growth hormone receptor gene that allow Bama pigs to mature to their full size.

Of course, cloning Bama fetuses comes with adverse health effects and shortened lifespan, as evidenced by other cloned mammals, such as Dolly the sheep. However, by breeding the genetically engineered male micro-pigs with normal female pigs, half of the offspring are born as micro-pigs without the adverse health effects of being born as clones.

Pigs are genetically and physiologically more similar to humans than the typical lab rat but are often rejected for lab work because of their large size, so micro-pigs were originally intended to serve as models of human disease in genetic research. However, a fringe pet market for unusually small animals has given the product a new purpose. As of now, BGI states that profit is the main objective for its new micro-pigs.


Recent Breakthroughs in the World of 2D Materials

Author: Kevin P. Nuckolls

In the past few years, the search for new and exciting two-dimensional materials has taken over the fields of materials science and nanotechnology. These materials have displayed previously unimaginable characteristics, including novel electronic properties and extraordinary mechanical behavior, making them some of the best candidates for solving some of the world’s toughest problems across numerous scientific disciplines. One of the most promising candidates in this field of research is graphene, a single-atom-thick layer of carbon atoms arranged in a hexagonally tiled formation.

Researchers at Cornell University have become interested in some of the mechanical properties of graphene. Some of the most significant findings come from Paul L. McEuen, the corresponding author on the newly published Nature paper entitled “Graphene kirigami”. McEuen realized that a sheet of graphene, given its strength and resilience, could be used to build complex three-dimensional structures by playing a role analogous to that of paper in the art of kirigami. The word “kirigami” is derived from the Japanese words “kiru”, meaning “to cut”, and “kami”, meaning “paper”. The ability to fold and cut sheets of graphene into a seemingly infinite number of nanoscale, durable structures could revolutionize the role graphene plays in a number of research fields.

Simple kirigami spring made of (a) paper and (b, c) graphene

McEuen’s group first sought to identify whether or not graphene has the correct physical characteristics to be used for kirigami. One of the most important material parameters for kirigami is a material’s Föppl-von Kármán number, which is a ratio of the material’s in-plane stiffness to out-of-plane bending stiffness. Using several different nanoscale material testing methods, his group found that the number associated with graphene was very similar to that of paper, making it an excellent medium for kirigami. With these promising results, his group proceeded to successfully cut and fold sheets of graphene into simple, nanoscale mechanical systems, such as springs and hinges. These devices can be manipulated using not only physical means, but also magnetic means by attaching the ends of the springs or hinges to small blocks of magnetic material. This feature would allow for remote control over such systems, thus allowing for a myriad of new applications in nanotechnology.
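For the curious, the Föppl-von Kármán number is usually written as a dimensionless ratio (standard thin-sheet elasticity notation; the symbols below are not taken from the paper itself):

```latex
\[
\gamma \;=\; \frac{Y\,L^{2}}{\kappa}
\]
```

where Y is the two-dimensional (in-plane) Young's modulus, κ is the out-of-plane bending rigidity, and L is a characteristic size of the sheet. Large values of γ, as for ordinary paper and for micron-scale graphene sheets, mean the material bends far more easily than it stretches, which is exactly the property kirigami exploits.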

Various forms of graphene kirigami

Researchers at Shanghai Jiao Tong University have explored the possibility of creating new two-dimensional materials out of other group-IV elements, following the precedent set by carbon in forming sheets of graphene. Previously, the two-dimensional silicon-based material silicene and the germanium-based germanene had been synthesized and examined. The experimentally unprecedented synthesis and characterization of a material called stanene was achieved by a team led by Jin-feng Jia, the corresponding author on the recently published Nature Materials paper entitled “Epitaxial growth of two-dimensional stanene”. Stanene is a 2D allotrope, or atomic configuration, of tin that forms a buckled honeycomb lattice composed of two offset triangular sublattices of tin atoms. The thickness of this system is about 0.1 nm, which fluctuates slightly depending on the material’s surroundings.


Molecular structure of stanene, top and side views

Jia’s group was able to grow these single layers of stanene on a substrate of Bi2Te3 using a technique called molecular beam epitaxy (MBE). In MBE, a substrate is placed in a chamber at extremely high vacuum, and the material to be deposited is heated until it becomes gaseous, after which it condenses onto the surface of the substrate. Reflection high-energy electron diffraction (RHEED) is often used to monitor the progress of this process. Jia’s group then analyzed the atomic and electronic structures of the stanene-on-Bi2Te3 system and found that the experimental data agreed quite well with their theoretical predictions and calculations.

Image of 2D stanene on Bi2Te3 using scanning tunneling microscopy, top view

 

If you’d like to read more about the developments in graphene kirigami, check out the full paper at the following link:

http://www.nature.com/nature/journal/v524/n7564/full/nature14588.html

If you’d like to read more about the developments of stanene, check out the full paper at the following link:

http://www.nature.com/nmat/journal/vaop/ncurrent/full/nmat4384.html

 

 
