Wednesday, October 7, 2020
Today, the Nobel Prize for “genome editing” was awarded to Emmanuelle Charpentier and Jennifer Doudna. Essentially, this was the CRISPR Nobel Prize. If enough of CRISPR's promise has already materialized for it to be worthy of a Nobel Prize, I can't imagine there is much place to go from here.
Modifying the genome of organisms has always been of great interest to scientists. Adding or removing genes allows us to understand how things work and lets us create microorganisms, plants or animals (humans are animals, right?) with traits never seen before. Knowingly modifying the genome of organisms has been done since people understood breeding. Inserting specific DNA elements began in the 1970s, but it wasn't very targeted and mostly amounted to inserting genes into random places in genomes. It wasn't until the 1990s that people started to be able to make targeted genome modifications.
Targeted genome modification allows highly specific and accurate changes to an organism's DNA. The co-opting of the cellular mechanism of homologous recombination is the backbone of all modern genome editing technologies, including Zinc Finger Nucleases (ZFNs), Transcription Activator-Like Effector Nucleases (TALENs) and Clustered Regularly Interspaced Short Palindromic Repeats (CRISPR). Both ZFNs and TALENs are mostly synthetic, i.e. they contain a portion that can be engineered to target DNA and a synthetically attached nuclease that cleaves DNA to initiate recombination and gene editing. CRISPR is almost completely natural, which makes one wonder how there are so many patents on its use, yikes.
CRISPR can modify most any living cell, but so can ZFNs, TALENs and other technologies. So why then was CRISPR heralded as the discovery of the century by the MIT Tech Review? Simply, CRISPR is easier to use because it targets DNA with a nucleic acid, a guide RNA, rather than an engineered protein. That makes genetic modifications cheaper and faster to produce.
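That "nucleic acid targeting" is the whole trick: to aim Cas9 at a new gene you just pick a new ~20-nucleotide guide sequence sitting next to an "NGG" PAM, instead of engineering a new protein as with ZFNs or TALENs. Here is a toy sketch of that search (top strand only, SpCas9's NGG rule, no off-target scoring; the function and its example sequence are mine, not anyone's real design tool):

```python
# Toy guide-RNA site finder for SpCas9 (top strand only, NGG PAM).
# Retargeting CRISPR is just picking a new 20 nt sequence -- no protein
# engineering -- which is why it is cheaper and faster than ZFNs/TALENs.

def find_guides(seq, guide_len=20):
    """Return (guide, pam, position) for each guide-length window
    immediately followed by an NGG PAM on the top strand."""
    seq = seq.upper()
    hits = []
    for p in range(len(seq) - guide_len - 2):
        pam = seq[p + guide_len:p + guide_len + 3]
        if pam[1:] == "GG":  # SpCas9 wants N-G-G right after the protospacer
            hits.append((seq[p:p + guide_len], pam, p))
    return hits

# Swapping targets means swapping a string, nothing more:
print(find_guides("GATTACAGATTACAGATTACTGGA"))
```

Compare that to re-engineering a zinc finger array for every new site and the cost argument makes itself.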
So what has CRISPR made possible? Not much, really. Most everything done with CRISPR can be done with one of these other technologies, albeit more slowly and at greater expense. CRISPR allows the scaling of genetic engineering so that time and money are much less of a factor. It is the cloud computing of biology, at least in my mind.
Despite claims by scientists and pharma companies, there is little chance CRISPR will ever be widely used in the clinic to directly treat disease. That is because it suffers from all the same faults as its predecessors, and maybe even more so. Gene editing has low efficiency in adult animals (yes, humans are animals) no matter the technique used. For instance, if you have a disease that affects the brain, you can probably only modify <1% of cells even using the best delivery techniques available. Really, the only way to get rid of genetically inherited diseases using gene editing is by modifying embryos.
Misleading as the hype has been, CRISPR can't actually make specific changes to a gene easily in an adult animal. That is because it requires what is called a donor template, basically just a DNA template that cells can use to create the genome modification. There is no efficient way to deliver donor templates in an adult animal, so all genome edits would need to be gene knock-outs only, i.e. you can only use CRISPR to destroy bad genes, not modify them to make them good genes. As you can imagine, this leaves a very limited scope of diseases that can and should reasonably be targeted using genome editing.
To date, all human clinical trials involving gene editing technology, whether CRISPR or otherwise, have failed to show any change in the disease condition. I don't see this as likely to change. While there might be a disease that can be helped despite the low efficiency of CRISPR, chances are that normal gene therapy, which doesn't edit the genome, will be easier and more successful. There are very few diseases that would require genome editing as opposed to just adding an extra copy of the gene to cells like gene therapy does. It's not all just conjecture either: just recently, in August 2020, the pharma giant AbbVie ended a partnership it had with Editas, one of the major CRISPR players. Apparently, I'm not the only one who sees CRISPR's future in the clinic as limited.
So what applications are left for CRISPR besides contributing to research? Some people are betting on diagnostics, using CRISPR's ability to target specific sequences of DNA. While this seems reasonable, it is unlikely that tried-and-true methods for DNA detection that use PCR will ever be significantly deplatformed. After that we are scraping the bottom of the bowl of guac.
I have been around CRISPR since near the beginning. The only thing that has remained constant is the hype. Even that has been fading. While it is hard to measure hype, Google Trends indicates that searches for CRISPR, both the topic and the search term, are on track in 2020 to be the lowest since 2016. We now know that CRISPR gene drives don't really work. No success for CRISPR in the clinic. CRISPR has already been used to edit human embryos. Really, the only thing keeping CRISPR hype alive is probably the MIT Tech Review.
In 2006, the Nobel Prize was awarded for RNAi gene silencing. MIT called it the breakthrough of the decade. I remember everyone being so excited about it! It was the hot topic at conferences and even my graduate school interviews. While RNAi was and always has been a great benefit to researchers, its actual application has been extremely limited. It took 13 years from the RNAi Nobel Prize to bring something to the clinic, and even then the two drugs have been a bit underwhelming. According to Google Scholar, papers even mentioning RNAi have been on the decline for the past 6 years. The drugs approved in the past few years haven't even slowed the decline.
I am sure people will continue to use CRISPR for years to come, but I hate to break it to you: CRISPR is dead.
Saturday, April 25, 2020
In Oakland, CA, where I live, there is a mandatory order to wear masks when in public. This is despite the fact that, as recently as March 31, 2020, the World Health Organization (WHO) suggested people not wear masks to help prevent the spread of coronavirus. On April 6, 2020, they released interim guidance suggesting that people in the community should wear masks, though masks are not effective alone. The CDC suggests we should wear masks, or rather cloth face coverings, and save medical masks for medical professionals. So what gives? Should we or should we not wear masks?
There have been a number of scientific studies, in a number of different real-world settings, showing that face masks alone don't reduce the spread of influenza-like illness in a statistically significant way. A meta-analysis of 15 of these studies agrees. However, a recent study done in a lab environment disagrees. When there is conflicting information from scientific papers and organizations, how do we decide who is correct and who is incorrect?
This is one of the major problems with science today: science is never wrong. You can almost always find a paper that agrees with the point you are trying to make. Glyphosate is a probable carcinogen according to the International Agency for Research on Cancer (IARC), but according to the WHO and the European Union it isn't. Do you believe the WHO over the IARC? Why? Because one seems more reputable? So the one with the best reputation wins? Is that what science comes down to for you?
Who do we believe and who do we trust?
Conflicting information like this happens often in science. None of the studies are technically wrong, so it is up to the bias of the reviewer/interpreter/observer to decide which evidence they want to agree with. If we even have enough evidence to disagree.
Models and data analysis are an interpretation of data that is guided by a human. We are left with a simple choice: either believe it was modeled correctly or don't. How can you say it wasn't modeled correctly if you don't even have information on how it was modeled?
In March, Imperial College London released a report that said up to 2.2 million people in the US might die from the coronavirus. The CDC ran models that suggested anywhere from 200,000 to 1.7 million might die. These numbers caused mass hysteria and the lockdown of most states, despite the fact that 200,000 to 2.2 million is about as wide a range as a prediction can get. Fortunately, we will most likely underperform and have nearer to 100,000 deaths or less in the US (the current number of coronavirus-related deaths on April 24 stands at 51,000, with over 26,000 from New York and New Jersey alone, who may be either overcounting or undercounting depending on who you ask). It is unknown whether any of the models took into account preventative measures, and I imagine the lockdown has decreased the number of coronavirus-related deaths, but by how many? In this scenario, could the models ever be wrong? If all the deaths stopped after 4,000, like in China, would anyone say these models were wrong, or would they instead point to preventative measures that saved us from the predictions of the models?
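The published models are black boxes to most of us, but the sensitivity is easy to demonstrate. Below is a toy SIR epidemic model, explicitly not the Imperial College or CDC model; every parameter here (R0, infection fatality rate, infectious period, initial cases) is an assumption I picked for illustration. Plausible-sounding inputs span nearly the whole 200,000 to 2.2 million range:

```python
# Toy SIR epidemic model -- NOT Imperial College's or the CDC's actual model.
# Every parameter (r0, ifr, infectious_days, i0) is an illustrative
# assumption, chosen only to show how sensitive death projections are.

def sir_deaths(population, r0, ifr, infectious_days=5, i0=100):
    """Run a crude daily-step SIR epidemic to completion; return projected deaths."""
    beta = r0 / infectious_days    # transmissions per infected person per day
    gamma = 1.0 / infectious_days  # fraction of the infected recovering per day
    s, i = population - i0, float(i0)
    while i >= 1.0:                # step until the epidemic burns out
        new_inf = beta * s * i / population
        s -= new_inf
        i += new_inf - gamma * i
    total_infected = population - s
    return total_infected * ifr    # deaths = total infected * fatality rate

# Same model, different assumptions, wildly different projections:
print(sir_deaths(330e6, r0=2.5, ifr=0.009))  # roughly 2-3 million deaths
print(sir_deaths(330e6, r0=1.2, ifr=0.002))  # roughly 200,000 deaths
```

Neither run is a prediction. The point is that the same code covers the entire spread of headline numbers depending on inputs nobody actually knew at the time.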
The models can never be wrong. This makes it so the science can never be wrong. I have seen it time and time again: scientists _decide_ some scientific publication is wrong and examine it under an intense scrutiny they fail to apply to papers that support their argument, and a _consensus_ is reached. Do we have great evidence that covid-19 is causing a really high death rate? Not really. Testing has been extremely limited and, according to the NYT, needs to be tripled before we can reopen. Why then are scientists trying so hard to dismiss studies suggesting that the number of infected people is much higher than what testing in the US has so far found? They really don't know if these studies are incorrect, because the evidence for a high death rate is itself based on an extreme lack of testing. Still, they have decided a low death rate is incorrect, and so that is what we are to believe.
The powers that be from on high have decided that this science is correct and that science is incorrect. The problem is that there is no empirical way to decide whether one piece of science is more correct than a piece that disagrees with it. It requires humans to judge, and humans are fallible and prone to bias. Here is where modern science breaks down. It is more about social and political posturing to get the greatest number of people to support your argument than it is about empiricism. If you disagree with the consensus you are automatically wrong. I mean, consensuses have never been wrong before, amirite (see: sarcasm).
If you still operate under the idea that science delivers us truth and facts, I beg you to reconsider your position and understand that beneath the peer review, the data collection, and the choices about which experiments to publish and which to leave out, is a whole pool of bias and misinformation. Scientists are just as biased as everyone else, maybe more so, because they have their reputation on the line. But we put them up in the heavens and pretend that they are presenting the truth to the rest of us, when they have no more access to the truth than anyone else.
Science might be doing us more harm than good but it depends on who you ask.
Wednesday, April 15, 2020
It's not funding. The NIH alone invests $41 billion in medical research a year. The top 20 pharma companies invest around $100 billion. China, Europe and Japan add nearly another $100 billion. That brings the total biotech R&D investment per year to somewhere near a quarter of a trillion dollars.
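The arithmetic, for anyone checking (these are the figures quoted above, rounded; I'm not sourcing new numbers here):

```python
# Rough annual biotech/biomedical R&D spend, in billions of USD,
# using the round figures from the post.
nih = 41
top20_pharma = 100
china_europe_japan = 100

total_billions = nih + top20_pharma + china_europe_japan
print(total_billions)  # 241 -- i.e. roughly a quarter of a trillion dollars
```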
Honestly, I imagine it is because there just aren’t enough scientists to go around.
Wednesday, April 1, 2020
Regulation is killing us, literally, as we don't get the drugs we need to help us fast enough.
Friday, December 13, 2019
Agar plates are a staple of most labs.
They hold moisture well and can have a reasonable distribution of nutrients, chemicals and antibiotics that you may need to grow your organisms.
The problem is that, for the uninitiated, agar plates aren't intuitive. Without an autoclave, melting all the agar can be a strange process of uncertainty. Streaking out organisms is a mess, and the agar is usually so soft that most people starting their forays into genetic engineering just destroy the plate.
I have been thinking a lot about how to make genetic engineering and lab protocols more intuitive. User interaction design is an important part of making a technology widespread. And let's be honest, science is not designed with the user in mind. Have you ever given someone a pipette for the first time and seen them try to use it? lol. Fuuuuckkkk you, science.
Science is so poorly designed that even PhDs struggle to learn and use new equipment, techniques and protocols. This is compounded by the problem that scientists are notoriously against change. If it works it doesn't need to be changed because generally scientists want to focus all their efforts on doing experiments so they can publish papers. There is no incentive in science to design better ways of doing science.
Running The ODIN, I get to see where people struggle the most when trying to learn genetic engineering and one of the biggest problems is making and using agar plates.
Agar is not expensive, so I'm not looking for something cheaper. If you are still using molecular biology grade agar, I applaud you for being a dumb ass. You can purchase 1kg of agar on Amazon for $40 or 1kg of "molecular biology grade agar" for $146; both work exactly the same.
What you really need is something that is easy to use, doesn't shatter into a million pieces if you touch it, and is intuitive for most people.
To me, an alternative would be something people are familiar with, something we use in our everyday lives. I'm just winging it here, not proposing these as real alternatives, but trying to think differently about science.
I chose a bagel.
I wanted to see if I could grow some genetically modified bacteria on it and propagate the cultures. I know this seems insane, and I think I went off the deep end on this one, but sometimes that is what is required.
First, I wet the bagel pieces with either water plus ampicillin or LB plus ampicillin (LB is a nutrient media that helps bacteria grow). Using antibiotics like ampicillin is a standard method in genetic engineering to select for the genetic modification. I took some ampicillin-resistant, GFP-engineered bacteria and streaked them on each of the bagel pieces. The fluorescence from GFP makes it easier to track the bacteria through the experiments. Streaking bacteria on a bagel is tough, as the surface was soggy and uneven, and inoculation loops aren't really made for that. Still, I managed to get a little bacteria on.
I put the materials in ziplock-type plastic bags, imagining this would hold the moisture in. I then incubated them at 37C overnight.
The bags did not hold moisture well and the materials seemed to dry out a bit. I think this will be one of the most important things to think about in the design.
Though the light conditions of this image are not comparable to the original taken before the experiment, it definitely seems like there is less fluorescence than when I started, I imagine because the bagel dried out. ugh.
Still, it seems like the bacteria survived, so I decided to break off a piece of bagel and use it to inoculate some liquid media to see if I could propagate the culture.
After 48 hours of having the fluorescent bacteria on the bagel, there doesn't appear to be contamination of any kind and the bacteria are still fluorescing.
I don't think bagels are the future of genetic engineering but I don't think agar plates are either.
Most everything that is done in genetic engineering and molecular biology is a complete kludge. The fact that I need like 5 different pieces of equipment to put DNA in bacteria is insane.
Am I really insane for doing this or are the people who continue to do science in such a kludgey way insane?
Tuesday, December 3, 2019
Human cells have long been assumed to be difficult and technical to culture, but recently I have built out protocols and a class that allow people to culture human cells with minimal equipment and experience, even in their own kitchen. I am constantly trying to use the simplest and most cost-effective techniques and materials to lower boundaries in science and thus increase innovation. I want anyone who wants to do experiments with human cells to be able to. This adds value for all humans, not just Biohackers.
What if all you needed to grow human cells was yourself? Your own blood.
Generally, when growing human or animal cells in culture, the liquid media used to grow the cells contains Fetal Bovine Serum (FBS). Fetal Bovine Serum is literally the serum (the non-red-blood-cell part of blood) from fetal cows. After a cow is slaughtered, the fetal calf is removed and its heart punctured to extract the blood before it dies. It is pretty barbaric and doesn't scale well. I mean, unless of course we want to farm fetal cows for the sole purpose of extracting their serum?
Why do scientists use FBS instead of serum from other animals? This is not really known. Most people say it is because FBS contains fewer antibodies than other serums and so is less reactive to the cells in the culture, but I am skeptical as to how many animal serums have actually been tested head to head. Scientists do this often: they do something only because other people did it, with very vague reasoning why. Scientists tend to be very skeptical of change. In fact, there are serums from adult chickens, goats, horses, pigs, rabbits and many other animals. You don't read about these often, or at all, in the modern scientific literature.
I wondered if fresh serum from my own body would work in human tissue culture. I am a trained phlebotomist (true story), so I decided to draw my own blood and use a centrifuge to separate the red blood cells from the serum.
The hardest part is always drawing my own blood. I usually use the median cubital vein on one arm and draw the blood with one hand, but thinking about it now, I should probably use a vein on my leg so I can use both hands at the same time. Oh well, next time? From that blood draw, I obtained around 6mL of serum. Probably about a third of the blood was serum, so if we use the maximum blood donation number of 500mL, you can get ~150mL of serum from yourself in one sitting. I should probably say: don't try this at home. Drawing your own blood isn't easy and you can hurt yourself. Again, I am a trained phlebotomist.
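The yield estimate is simple arithmetic; note the one-third serum fraction is my rough figure from that single draw, and real yield varies from person to person with hematocrit:

```python
# Rough serum yield from a whole-blood draw. The ~1/3 serum fraction is a
# rough estimate from one draw; actual yield varies with hematocrit.
def serum_yield_ml(blood_ml, serum_fraction=1/3):
    return blood_ml * serum_fraction

print(serum_yield_ml(500))  # ~167 mL, in the ballpark of the ~150 mL above
```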
I wanted to grow up some human cells with my serum, but I wanted to compare it to FBS and Newborn Calf Serum (NCS), which as the name implies is from newborn calves (I don't know if any calves were hurt in the salvaging of that serum, but I assume not). NCS is much less expensive monetarily, and much less expensive morally and ethically.
This is pretty fucking cool. Run out of FBS? In a pinch you can use your own serum. But seriously, this experiment is more to prove a point.
The biotech and science industry is truly fucked because everyone is doing what everyone else is doing and hoping that somehow their dogged go-getty attitude will lead to innovation. What people need to be saying is "Fuck what everyone else has done. Maybe they just did what was readily available to them. I am going to figure shit out instead of copying for vague reasons just so I can do my research as fast as possible and publish a paper no one cares about."
The explosion of human tissue culture and lab-grown meat cannot rely on FBS forever, as there is an extremely limited supply that cannot scale. A whole company or industry could be built around supplying a low-cost, scalable FBS alternative that works just as well or better. Fetal Bovine Serum alone has a market size of $700 million. This will only grow.
Maybe growing human meat using human serum is the answer or maybe that is just my next project.
The nitty gritty...
For human cell culture work, I mostly focus on HEK293 cells. This is a human embryonic kidney cell line that was modified to be "immortal". These immortalized cells are robust and great for people learning cell culture techniques. Obviously, conditions that work for HEK293 cells cannot be applied to _all_ cell lines. However, I think it would also be foolish to think they can't be applied to many other cell lines.
I culture human cells in a non-CO2 environment, because you don't actually need CO2 for tissue culture. The reason people use CO2 is mostly that they use DMEM, which contains bicarbonate, and CO2 is required to buffer that media. Honestly, I don't get it? Still trying to figure out why people use this contrived method.
I use L15 media. This media is not buffered with bicarbonate, so no CO2 is required. In the media, I generally use Ampicillin (100ug/mL), Streptomycin (100ug/mL) and Gentamicin (50ug/mL). This prevents most any bacterial contamination that can happen in a non-sterile setup. Yes, that is correct: no sterile hoods or sterile areas are used in my human cell culture. With a little experience, and by sterile filtering (0.22um) the media, I rarely experience contamination of cultures.
Briefly, HEK293 cells were grown to confluency in L15 with 10% FBS and the above antibiotics. The cells were washed in PBS and then incubated in a 0.25% Trypsin, 0.02% EDTA solution for 5 minutes to remove the adherent cells. The same volume of cells was then added to each well. Each well contained L15 with 10% of one of the serums plus the above antibiotics. Cells were incubated at 37C with no CO2.
Wednesday, August 21, 2019
In genetic engineering, when scientists modify bacteria or yeast they use antibiotic selection. This means they give the genetically modified bacteria and yeast antibiotic resistance, because it makes it easier to tell which were engineered: organisms that survive the antibiotics were most likely engineered. This is not always the case; contamination, escapers and natural mutations can give false positives. People doing genetic engineering for the first time experience these issues much more than a seasoned experimenter, so it is important to know what to blame so you can get the experiment right.
Media is the term used to describe the food organisms eat to survive. It basically contains sugars, nitrogen and other macro- and micronutrients. Generally, media is heated to sterilize it; you don't want random bacteria to grow in your media and ruin your experiment.
In a professional lab environment, many scientists will use an autoclave, which heats to 121C at 15 PSI, while people doing experiments in a more modest setting will use a microwave or an oven, which can only reach ~100C before the liquid boils over. In most cases 100C is sufficient to sterilize media. In fact, in many cases not heating at all and just adding antibiotics is more than enough to keep media sterile over the course of a 2-3 day experiment.
When making media, scientists wait until the media cools before adding antibiotics. This is good practice: if you can wait 30 minutes, no harm is done by adding the antibiotics later. However, this practice has led many people to believe that heating antibiotics in any way will destroy them. In fact, that is what I was taught: only add antibiotics once the media cools to below 50C.
Because I am lazy and always try to do things differently than the establishment, I started adding antibiotics to my media before it cooled a long time ago, and have rarely if ever had problems.
I never did a head-to-head experiment, though. I never compared heated media to non-heated media to media without antibiotics. So I decided to give it a go.
I heated up LB Agar in a microwave and added antibiotics at ~95C.
I used standard working concentrations for bacteria:
Kanamycin - 50ug/mL
Ampicillin - 100ug/mL
Chloramphenicol - 35ug/mL
Streptomycin - 100ug/mL
G418 - 200ug/mL
Each antibiotic spent about 5 minutes at >90C. The media was then allowed to cool at room temperature so the agar plates could solidify.
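If you are mixing your own plates, the volume of antibiotic stock to add is just C1V1 = C2V2. Here is a sketch using the working concentrations above; the stock concentrations in the example are typical values I'm assuming, not numbers from this experiment:

```python
# How much antibiotic stock to add to media: C1 * V1 = C2 * V2.
# Working concentrations are the ones listed above; stock concentrations
# in the example below are typical assumed values, not from this experiment.

WORKING_UG_PER_ML = {
    "kanamycin": 50,
    "ampicillin": 100,
    "chloramphenicol": 35,
    "streptomycin": 100,
    "G418": 200,
}

def stock_volume_ul(antibiotic, media_ml, stock_mg_per_ml):
    """Microliters of stock to add to `media_ml` mL of media."""
    total_ug = WORKING_UG_PER_ML[antibiotic] * media_ml  # C2 * V2, in ug
    return total_ug / stock_mg_per_ml                    # ug / (ug/uL) = uL

# e.g. ampicillin from an assumed 100 mg/mL stock into 250 mL of LB agar:
print(stock_volume_ul("ampicillin", 250, 100))  # 250.0 uL, i.e. a 1:1000 dilution
```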
I also made:
LB Agar with no antibiotics
LB Agar with Ampicillin (100ug/mL) added at ~50C
I took a tube of DH5a E. coli, grew it to OD600 0.6 in SOC, and then plated 10uL four times on each plate. I let all the plates grow overnight, 18 hours at 37C.
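For scale, a common rule of thumb, and only a rule of thumb, since the conversion varies by strain and spectrophotometer, is that OD600 1.0 corresponds to roughly 8x10^8 E. coli cells/mL. That puts each 10uL spot in the millions of cells, so if the antibiotics had been meaningfully destroyed there were plenty of cells around to show breakthrough growth:

```python
# Ballpark cells per plated spot. Rule of thumb (an assumption, varies by
# strain and instrument): OD600 1.0 ~= 8e8 E. coli cells/mL.
CELLS_PER_ML_AT_OD1 = 8e8

def cells_plated(od600, volume_ul):
    return od600 * CELLS_PER_ML_AT_OD1 * (volume_ul / 1000.0)

print(cells_plated(0.6, 10))  # ~4.8 million cells per 10 uL spot
```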
As you can see from the plates, there is clear growth on the LB Agar plate with no antibiotics and no growth on the other plates, whether the antibiotics were added at >90C or at 50C. The antibiotics were not destroyed by heat, at least not enough to permit bacterial growth.
After ~40 hours there is still no growth on any of the antibiotic plates, but the LB Agar plate without antibiotics has some random contaminating strain of bacteria growing.
The antibiotics seem to be working fine.
If you don't believe me try the experiment yourself. It is fairly easy to perform.
I am not saying "no portion of the antibiotics in the media was destroyed". What I am saying is that it is safe to heat antibiotics and still have enough left over to prevent standard lab bacteria and contaminating bacteria from growing, which is their purpose in this case.
I am not saying this method is how everyone should make their media. What I am saying is that if you do heat your media with antibiotics in it, you are OK and it won't ruin your experiment.