Flow Cytometry webinars

View webinars as well as full transcripts from researchers on flow cytometry topics including experimental setup, compensation, and data analysis. We've also got webinars presented by our experienced technical support scientists that provide tips and tricks and offer guidance for selecting the best products to help you achieve optimal results.

Recorded flow cytometry webinars

More parameters, more informative phenotypes: Systematic phenotyping approaches

Join Herve Luche to find out about cutting-edge phenogenomics and how the Invitrogen Attune NxT Flow Cytometer with Autosampler is helping accelerate research by unlocking new discoveries.

00:00:00-Slide 1

Hello, everyone. It is my pleasure today to introduce Dr. Herve Luche of the Centre for ImmunoPHEnomics. Herve studied developmental immunology in the lab of Bernard & Marie Malissen where he developed a system combining lineage tracing and multi-parametric flow cytometry to elucidate the molecular signaling underlying lineage decisions of immune cells. In 2012, Herve joined the Centre for ImmunoPHEnomics in Marseille, France. With its cutting-edge expertise in mouse genetics and immunology, the Centre for ImmunoPHEnomics aims to develop and analyze, in a massively parallel and standardized mode, genetic mouse models to understand the function of the mouse immune system under physiological and pathological conditions. Herve is the scientific director and R&D manager of the immunophenotyping module at the center, where he drives the development of new phenotyping assays and analysis pipelines such as a new, high-content flow cytometry panel of immune cells at basal, inflammatory, or infectious conditions. He contributes to national and international phenotyping efforts, such as the International Mouse Phenotyping Consortium. Finally, he is leading the mouse cytometry effort at the center for use in functional phenotyping. We are excited to have Dr. Luche present to us today a webinar titled: More Parameters, More Informative Phenotypes: Systematic Phenotyping Approaches. Thank you.

00:01:27-Slide 2

Thank you. And hello everyone. I will present to you how we actually work at CIPHE and also how we use the instrument, the Attune NxT, in that context.

00:01:36-Slide 3

So, first of all, the aim of the Centre of ImmunoPHEnomics where I’m working is to actually understand better the immune system. So, that’s our subject of interest. It’s actually highly complex because there are lots of cells that need to interact together to actually fulfill the function of the immune system. So, you need to understand and tackle this complexity at first.

00:02:07-Slide 4

We are doing that in the mouse, with the idea that we can transfer the information we gain to humans, for a better understanding of the human system as well.

00:02:37-Slide 5

Because of that, we actually mostly rely, all of us, on the information that has been published and that is present in the literature. Most of the time, mouse studies are actually being performed by individual labs and it has been shown that the data that is coming out of these studies sometimes is hard to reproduce because most of the time there is a low statistical power due to not enough replication of the work.

00:03:01-Slide 6

That’s why at the Centre for ImmunoPHEnomics, we try to approach the problem from a different angle and use systematic approaches to actually be reproducible and be able to compare results with each other.

00:03:16-Slide 7

I’m working at the Centre for ImmunoPHEnomics. You see where it is located.

00:03:23-Slide 8

Here, I’m working in the module of immunophenotyping. The idea is really to blend several things. One is mouse models, with either the disruption of a gene by knockout, but also reporter lines like knock-ins and fluorescent reporters; combine that with instrumentation; and then blend that also with bioinformatics analysis pipelines, to be able to perform first high-content analysis and then fish out immunophenotypic signatures that we could correlate to either a new therapeutic agent, the function of a gene, a signaling cascade, and so on and so forth.

00:04:12-Slide 9

For this, what we have acquired is a suite of instruments of different kinds. We have conventional flow cytometers like the Attune NxT, but also instruments from BD, the Fortessa and the Canto. Then, we have a spectral cytometer, the SP6800 from Sony, and a mass cytometer. Lately, we also invested in the 10x Genomics technology to be able to do single-cell genomics.

00:04:43-Slide 10

For that, we have defined high-content panels that allow us to profile the whole mouse immune system, in different organs and looking at different cell types, by conventional flow cytometry.

00:05:00-Slide 11

Obviously, the star of today is the Attune NxT. We have been early adopters here: we started with the legacy instrument, which had only two lasers and six colors. Then we upgraded to the Attune NxT, again two lasers but eight colors, and recently we took the V6 version of the Attune NxT, which is a three-laser, 12-color instrument. I’m going to present results taken from this instrument.

00:05:34-Slide 12

The purpose of what we do, with the help of these different panels, is to capture some images in our control mice and then compare whether these images are also present in mutant mice. For example, you see that for these mutant mice, we didn’t capture any difference using these four panels. Sometimes we capture one difference with one of the panels, and sometimes we capture differences with all the panels. Really, the idea of immunophenomics is to be able to find which cells are important for this change of image in one of the mutants we are studying.

00:06:14-Slide 13

Besides this, it’s really important for us to also have a pipeline that allows us to analyze lots of samples from lots of mice at the same time, to minimize variation.

00:06:30-Slide 14

So, with this increase in throughput, we also combine that with different disease models to actually pressure the immune system and see how it responds. It can be inflammation, so we have assays like peritonitis, cytokine storm, or IBD, and so on. We can also challenge the mice with tumors; we have syngeneic tumor models on the B6 background. We’re also starting that on humanized mice with human tumor cell lines. We can also challenge the animals with infectious agents.

00:07:20-Slide 15

Obviously, when you do these kinds of studies and you want to have a comprehensive view of how the immune system is coping with these challenges, it’s interesting not only to look in blood but also to look at different organs at the same time. So, that’s what we are aiming for. For example, in immuno-oncology pipelines, we might want to study the tumor, but also the infiltrating immune cells within this tumor, or the draining lymph node and the cells within this compartment.

00:07:54-Slide 16

For this, here is a real-life pipeline that we can perform at CIPHE. You see that we can look at the same time at peripheral organs, for example the spleen, with an orientation panel that describes all cell types, or add panels that are more focused on a given cell type, for example the T-cell memory panel that focuses on T cells. We can look in blood at the different types of cells. Also, for the tumor-infiltrating leukocytes extracted from tumors, we can look by mass cytometry, for example, at the functional status of T cells extracted from the tumor environment. And we can look at the same time on the spectral cytometer at the tumor cells and profile them.

00:08:45-Slide 17

Obviously, to be able to cope with lots of samples, we have to dimension our pipeline. We actually have machines to extract cells from organs; that’s the one from Miltenyi. If we need to enrich, we also have Miltenyi tools to do that by magnetic selection. Then, we use our Fortessa for acquisition.

00:09:12-Slide 18

We have a few limitations to this. It’s mostly constrained by time. It’s really group work, so many application specialists work together to perform these big experiments. We can run up to five panels on 48 individual samples in a single day, and only one mucosal tissue type at the moment because it takes more time to process.

00:09:40-Slide 19

So, as I said, for this to work, you really need to find an immunophenotype and try to remove the noise from the experiment; that is what is depicted here in red. We really wish to remove as much as possible of the noise that is coming from the experiment, to keep only the green signal, the one that is correlated to the mutation we want to study or to the therapeutic treatment we are testing.

00:10:07-Slide 20

So, it’s really important to neutralize variation coming at the operator level, by automating some steps like cell extraction and, as I showed you, cell counting; at the reagent level, where you need to titrate, for example, the antibodies you are using to make sure they always perform in the same manner; and at the instrument level, which you can do using a set of beads to ensure the sensitivity is always the same. Something that has long been put aside is variation at the data analysis level. This is actually now fixed at CIPHE using supervised automated gating.

00:10:53-Slide 21

These are just pictures which show how this works, from cell extraction to cell counting. Cell counting is performed on the Attune NxT. Sorting, acquisition, and automated analysis are all done following a standard operating procedure, to be standardized and be able to compare results later on. We are looking into pipetting solutions for our own antibody panels, and also solutions to prepare cocktails of antibodies and store them over time, because we see that some variation is still coming from there and we need to neutralize it.

00:11:34-Slide 22

Now, the Attune NxT, because that’s the star of today.

00:11:40-Slide 23

How do we use it at CIPHE? First of all, I wanted to tell you that we did not take that for granted. When we looked for a solution, we compared different instruments. In our hands, already the legacy Attune was particularly well adapted. Why? Because it allowed us, with the help of the Autosampler, to acquire a large number of samples very fast and determine absolute counts. That was point A. Point B is that it was very hard to clog the system, mostly due to the acoustic focusing, making this for us a very robust instrument in the way we wanted to use it.

00:12:27-Slide 24

I talked a little bit about the Autosampler. For us, it’s the most robust combination, the HTS system and the Attune NxT, to acquire fast and count samples. We are also very pleased by the fact that this Autosampler accommodates deep-well plates. That allows us, and we will see that in a minute, to count on crude extracts, so raw extracts of cells. Without treatment, we just digest the tissues, extract the cells, and directly count from a dilution of that. It allows us to minimize the steps until we get the cell count.

00:13:12-Slide 25

Volumetric counting, as you’ve heard, we are really fans of that. It needs a little bit of assay setup, and you can check it using this plot, where you draw a gate at the beginning of your acquisition and one at the end of the acquisition. Why? Because the Attune NxT drives the sample through a syringe pump, and when you push the sample into the machine, you push it with some sheath fluid at the end of the sample. If you count in the area where the sheath is pushing the sample, you will have some dilution effect and your counts might be skewed. So, it’s only a matter of setting up your assay at the beginning for the sample types you are interested in, and making sure that you always have a stable count between the beginning and the end of the acquisition, so that you do not underestimate the number of cells you have. Once you have set it up for your particular assay, you can use that protocol over and over again. That’s what we do at CIPHE.

00:14:30-Slide 26

Just to show you here that each time we do a counting experiment, we validate our instruments; we ensure that the instrument counts well. Most of the time it does; this check is just to make sure that the fluidics of the instrument are clean for this kind of assay. We do that using counting beads, the CountBright beads from Thermo, which should always give roughly the same count, so we make sure this is right. If the count is not correct, then we wash the instrument, clean it, and pass the counting beads again. When we get a good count, and most of the time it’s already good from the start, we can start counting the cells in our samples.

00:15:18-Slide 27

Here is just what we’ve been defining for our own tissue types, so spleen, thymus, lymph nodes: how we should dilute the raw cell extract so that we get a good count, within the specs of counting described in the Thermo white paper about absolute counting using the Attune NxT. It basically follows this rule: the count should average 50-110 events per microliter. Then, it’s very handy because you have a statistic embedded within the Attune NxT software, the events per microliter; you can multiply that by your dilution factor and you get the number of cells you have in your sample.
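The arithmetic described above can be sketched in a few lines. The 50-110 events/µL window is the one stated in the talk; the function names themselves are illustrative helpers, not part of the Attune NxT software.

```python
def cells_per_ml(events_per_ul: float, dilution_factor: float) -> float:
    """Cell concentration of the original extract, in cells/mL.

    events_per_ul: the events/uL statistic reported by the acquisition software.
    dilution_factor: how many-fold the raw extract was diluted before counting.
    """
    return events_per_ul * dilution_factor * 1000.0  # 1 mL = 1000 uL


def rate_in_spec(events_per_ul: float, low: float = 50.0, high: float = 110.0) -> bool:
    """True if the measured event rate sits in the recommended counting window."""
    return low <= events_per_ul <= high
```

For example, an extract diluted 1:10 that reads 80 events/µL (within spec) contains 80 × 10 × 1000 = 800,000 cells/mL.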

00:16:14-Slide 28

Now, how do we use the Attune NxT and this counting capability, and why is it so important?

00:16:22-Slide 29

I will talk a little bit about mass cytometry now. Mass cytometry is basically the same as flow cytometry at the beginning. The only change, I would say, is that you stain cells with antibodies that are coupled to metals and not fluorochromes anymore. Then, you pass them through this machine one after another. Then, you have a step where you put these cells into an argon plasma at a very high temperature, 5,500 Kelvin. The result is that each cell is transformed into an ion cloud, and within this ion cloud we count the elements that were present on the antibodies. I won’t go into further detail on that because that’s not the aim of today.

00:17:10-Slide 30

One thing that we noticed very early on is what happens if you pass your sample without counting the number of particles in it and you look at the patterns of expression. For example here, I’m looking at B cells, so CD19-positive, and I look at class 2 expression versus CD4. Obviously, on B cells I have class 2 expression and no CD4. What we saw in this early experiment, where the cellularity of the sample was not determined, was class 2 expression together with CD4. Remember, I’m studying mouse, and that’s a real artifact. We had roughly 16% artifacts in this analysis. This comes from the fact that these ion clouds need some space and time separating them. If you pass a sample that is too concentrated in events, what you come to see is a fusion of these ion clouds. It’s similar to the doublets you would see in flow cytometry. So, what is key there is really the sample introduction speed and the number of events relative to your cellularity.

00:18:22-Slide 31

This can be very drastic, leading to lots of artifacts. If you gate on singlets or doublets and use different concentrations of your sample, you see that, past a given threshold, the percentage of doublets increases very drastically. For example, if you use eight times the recommended concentration, you are over 25% doublets within your sample, which is very bad. That’s why we found that our optimal concentration was 0.5 million cells per mL and that we should acquire at 300 events per second.

00:19:08-Slide 32

Because of that, what we do now is count each sample on the Attune NxT before we inject it into the mass cytometer, to prevent going over our threshold cell concentration and to ensure we get a very reduced number of artifacts and doublets.

00:19:30-Slide 33

Now, what we also do on the Attune NxT is define our yield of enrichment when we do magnetic selection.

00:19:36-Slide 34

Here is an example. In this particular example, it’s a tumor of a B-cell leukemia that we injected under the skin of the mice. What we want to do here is remove the tumor cells and enrich the infiltrating leukocytes. So first, before magnetic depletion, we count the number of cells and their proportions using the Attune NxT.

00:20:10-Slide 35

Then, we do our magnetic selection; here it’s a depletion, we want to remove the B cells, so the cells that are labelled with CD19 coupled to magnetic beads. After magnetic depletion, on the Attune NxT we can count the number of cells that remain and then calculate our ratio of enrichment, our yield of enrichment. For us, it’s particularly handy because we can calculate this enrichment, and we can also calculate the number of beads we need to put in relative to our crude extract to perform a very good, pure enrichment of our population.
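The yield calculation described here can be sketched as follows. This is a minimal illustration, assuming the counts and purities come from the Attune NxT measurements before and after depletion; the function name is my own.

```python
def enrichment_metrics(count_before: float, purity_before: float,
                       count_after: float, purity_after: float) -> tuple:
    """Yield and fold enrichment of the target population after magnetic selection.

    count_*: total cell counts measured on the cytometer before/after selection.
    purity_*: fraction of cells belonging to the target population (0-1).
    """
    target_before = count_before * purity_before
    target_after = count_after * purity_after
    recovery = target_after / target_before   # fraction of target cells recovered
    fold = purity_after / purity_before       # how much the purity improved
    return recovery, fold
```

For example, going from 1e8 cells at 10% leukocytes before depletion to 1.5e7 cells at 60% after gives a 90% yield and a 6-fold enrichment.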

00:20:54-Slide 36

This is also very important. If you do that, then you can start working on cells and do phenotyping on these tumor infiltrating leukocytes that have been purified because they are devoid of any tumor cells.

00:21:09-Slide 37

The next point is to me very important, especially for flow core facilities that are shifting to single-cell genomics, for example, but also for other applications.

00:21:18-Slide 38

I don’t know if you’ve been there. I actually worked in a research institute as a post-doc before, so I know both situations; now I am the director of this core, where people also come to sort with us. Most of the time, you can see this kind of situation: the scientist is not happy because he came with a sample of 500 million cells that are really precious to him, and after the sort he only got 30,000 cells out. Then, he blames the core facility because he thinks the sort was not set up well by the core facility people. The core facility people say that most of the time the counts are off and completely wrong, that what the sorter sees is a lot fewer of these cells, that among these cells 80% were dead because the cell prep was bad, and that at the end of it the cell sorter could not do magic, so very few cells came out. So, you need a "judge of peace", a neutral arbiter, to see who is right and who is wrong, and we use the Attune NxT for that. When external people come, we make them count the number of cells in the sample they bring to the facility, and we also check its viability. We then trust the values that we get out of the Attune NxT, because it’s done in the facility and both the core lab and the scientist are looking at the result. Then we have a far better foundation for the cell sorting, and it goes very well.

00:23:03-Slide 39

Another situation where it’s very important to characterize the sample very well is after the cell sorting experiment, when we want to make sure that we put a given number of cells into our assay. This you can also do with the Attune NxT. It is particularly important when you want to use these cells for single-cell genomics, for example CITE-seq. Here is a scheme of it: with this technology, you are able to combine antibodies coupled to oligonucleotides to fish out proteomic information. Combined with the sequencing of the transcripts, you can now look at both the transcriptome and the proteome at the single-cell level.

00:23:53-Slide 40

The only issue with this technology is that it’s super costly. I have put here the prices that we get in France for the different steps. Because of the price, and because with this technology you can study at maximum 20,000 cells per sample, you’d better be sure that you put your 20,000 cells into the assay and that they are all in good condition. In this method, if you put in too many cells, it directly impacts the sequencing depth and the number of genes you will see: the reads you have for determining transcripts are distributed over more cells, so you see fewer genes per cell. So, you need to be very precise there, also to optimize cost.
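The depth trade-off above is simple arithmetic: a fixed read budget is shared by every cell you load. The numbers in the example are illustrative, not from the talk.

```python
def reads_per_cell(total_reads: float, n_cells: int) -> float:
    """Mean sequencing depth per cell for a fixed read budget."""
    return total_reads / n_cells
```

Doubling the loaded cells from 20,000 to 40,000 halves the per-cell depth, e.g. a 400-million-read budget drops from 20,000 to 10,000 reads per cell.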

00:25:00-Slide 41

That’s why we did that on the Attune NxT. We had to adjust a few things, first the sorting in this given project. We actually used the Attune NxT to make sure we sorted enough cells to get 20,000 cells per population.

00:25:17-Slide 42

Here, it was a complicated experiment we wanted to look at, and there were several steps of optimization. We concentrated the samples more, and each step was checked by counting on the Attune NxT the number of cells we got. We could then also determine the sorting needs with the Attune NxT, to find the best compromise between number of cells, speed of centrifugation, and time of cell sorting, to ensure the cells were in top condition for downstream single-cell genomic assays.

00:26:01-Slide 43

This is basically just depicting the different steps that we followed. We found that centrifugation at 2,400 rpm for five minutes after the sort let us recover 68% of the cells that we initially put in after cell sorting. So, we went with this for single-cell genomic assays.

00:26:26-Slide 44

I will now say a few words on rational panel design, particularly on the Attune NxT, and the methodology.

00:26:38-Slide 45

I told you our main activity is phenotyping mutant mice, but that’s actually wrong: we do "painting" every day. Why? Because we design panels to be optimal for a given instrument, to achieve the best resolution of our populations. We have 2,500 different antibodies in the lab, or "paint tubes" if we follow this analogy. These come in different flavors, different colors: around 30 in flow cytometry and 50 in mass cytometry. Then, we have different kinds of pencils, which would be an analogy for the detectors: 18 pencils of different sensitivities, which would be the sizes of the pencils, across flow, spectral, and mass cytometry. So, when you want to design a panel, you need to take all of this into account, and we try to approach it in a rational manner to minimize the time we spend designing these panels.

00:27:43-Slide 46

The aim is really to first find the CD markers that will define the population of interest.

00:27:52-Slide 47

Once we have that, we also see on which instrument we can run the panels, or what we want to do. For that, we need to know our instruments very well; the first step in panel design is actually the instrument characterization step. Then, you want to match the brightness of a dye with the expression level of your molecule of interest for your given panel, and that means you need enough prior knowledge about what is bright and what is not on your instrument. Also, very importantly, something that came up recently: you need to look at the spectral overlap of your dyes and the spread of data that results from this overlap. This depends on your instrument’s optical configuration, so you need to define it on your instrument first. Once you have that, you also need to know something about your biological sample: what are your populations of interest, what is the density of expression of the antigens, and whether the antigens you want to look at are co-expressed or not on a given cell type.

00:29:01-Slide 48

Again, the first step is instrument characterization. You have to assay the sensitivity of your instrument and find the right voltages to use for your detectors.

00:29:15-Slide 49

In the lab, we have been developing, sort of as we go, a stepwise evaluation of instrument performance, and I will talk about it today because we did it on the Attune NxT. This pipeline is completely inspired by the paper of Stephen Perfetto that I put here in the notes on the side and that I really encourage you to read; it’s very nice for this kind of work. We just followed it, and we have characterized several instruments based on this pipeline.

00:29:53-Slide 50

First thing, you need to know the optical configuration of your instrument. Here we are on the Attune NxT V6. So, you have tables that tell you what kind of fluorochromes you can look at, the laser power, and so on.

00:30:10-Slide 51

Then, the next step is to define a voltage at which you are over the electronic noise. Before, this used to be done by eyeballing, putting your cells at a given position on the logarithmic scale. This is actually not so good and should be stopped: the more colors you have, the more problems this method brings you. The idea you should follow is to put your cells over the electronic noise of your system for a given detector, and also never acquire signals that are over the maximum linearity of your detectors. So, when you characterize your instrument, you need to find ways to look at the electronic noise, define the maximum of linearity, and work within this interval.

00:31:12-Slide 52

So, the way we check whether we are within this good interval for measurement on each detector is that we use rainbow beads. On these rainbow beads, we look at different statistics on different peaks. We take the rCV of the fourth peak of the rainbow beads, and then we look at the MFI of peak five and peak three. From these, we determine the linearity range, which is calculated using this formula: MFI of peak five minus MFI of peak three, divided by the MFI of peak three. That gives you values showing whether you are linear or not across that range of fluorescence. Then, the linearity max is determined by looking at peak seven and seeing when the signal stops increasing: when it is stacked against the top of the scale, you have reached the maximum of your linearity. We run these beads across different gains (gains for APDs, or volts for PMTs) in a so-called voltration, and we watch the evolution of these statistics. You see, for example, that the rCV is very high at low voltage, and when your voltage is high enough it becomes stable and remains flat; that’s when you are over the electronic noise. When your linearity range, the blue line, is stable and then starts to fall, that means one of your peaks has gone off the measurement scale; you have pushed your voltage too far. You also see the linearity max increase with voltage and then reach a plateau; at that point your peak is not preserved anymore, so you’ve reached your maximum of linearity. So, the window in which we can work on the given detector is here boxed with red dots. That’s how you define the voltage range, or the range of interest, you can use for your given detector.
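The voltration statistics above can be sketched as follows. The linearity-range formula is the one stated in the talk; the flatness criterion used to detect where the rCV curve levels off is an illustrative assumption of mine, not the lab's actual rule.

```python
def linearity_range(mfi_peak5: float, mfi_peak3: float) -> float:
    """Linearity-range statistic: (MFI_peak5 - MFI_peak3) / MFI_peak3."""
    return (mfi_peak5 - mfi_peak3) / mfi_peak3


def min_voltage_over_noise(voltages: list, rcv_peak4: list, tol: float = 0.05) -> float:
    """Lowest voltage at which the peak-4 rCV curve has flattened out.

    Scans the voltration series; once two consecutive rCV values differ by
    less than `tol` (relative), the curve is considered flat, i.e. the
    detector is over the electronic noise.
    """
    for v, r0, r1 in zip(voltages, rcv_peak4, rcv_peak4[1:]):
        if abs(r0 - r1) <= tol * r0:
            return v
    return voltages[-1]  # never flattened: fall back to the highest voltage
```

For instance, an rCV series of 20, 10, 5.1, 5.0 across 200-350 V flattens at 300 V, the candidate minimal working voltage for that detector.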

00:33:39-Slide 53

We ran this on all of our detectors on the Attune NxT, defined the good working interval, and defined the minimal voltage at which we are over the electronic noise.

00:33:51-Slide 54

Then, what we do, we try to define what is bright or not.

00:33:56-Slide 55

This is the work of Theo Loustaud in the lab. It’s always the question of what is bright and what is dim. Here, we see that for different dyes that you could detect in a given detector, the brightness can be very different depending on the dye. So, it’s very important to check that, but you need some previous knowledge of the dyes, and you never know in advance how they behave on your instrument. That’s why you need to do this once on your instrument to characterize it. It can, however, be very labor intensive to define this brightness and fit it with the molecules you want to look at.

00:34:45-Slide 56

For this, we tried to standardize a bit and do one very big experiment. That’s how we established our brightness index of different dyes. We take splenocytes and put two million cells per well, and then we stain them with anti-mouse CD8, always the same clone and the same quantity, 0.4 micrograms; the only thing that we change is the fluorochrome to which this antibody is conjugated. We do our staining under the same conditions, in 100 microliters, for all the conjugates. On the Attune NxT, that’s 35 conjugates we’ve been testing that fit our optical configuration. Then, after staining, we wash, and we acquire all the single-stained samples on the Attune NxT.

00:35:40-Slide 57

The other thing is that we acquired this at three different voltages. The base, minimal PMT voltage is the one that we defined using the rCV method to be over the electronic noise. Then, we increased it by 50 or 100 volts. This is just to see whether this minimal voltage is actually the optimal one for the resolution of the dyes. Then, obviously, we select the different dyes that fit our cytometer; I mentioned that already.

00:36:15-Slide 58

Then, to define what is bright or not, we use the stain index. The formula is shown here, and from it you can derive this brightness index.
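The exact formula is on the slide, which is not reproduced in the transcript; the sketch below uses a commonly used form of the stain index, (MFI_pos − MFI_neg) / (2 × SD_neg), together with the normalization mentioned a bit later. The slide's variant may differ (e.g. using robust SD).

```python
import statistics


def stain_index(pos_values: list, neg_values: list) -> float:
    """Stain index in a commonly used form:
    (MFI_pos - MFI_neg) / (2 * SD_neg), with medians as MFI."""
    mfi_pos = statistics.median(pos_values)
    mfi_neg = statistics.median(neg_values)
    sd_neg = statistics.stdev(neg_values)
    return (mfi_pos - mfi_neg) / (2.0 * sd_neg)


def normalized_stain_index(indices: dict) -> dict:
    """Express each dye's stain index as a percentage of the brightest dye."""
    top = max(indices.values())
    return {dye: 100.0 * si / top for dye, si in indices.items()}
```

The normalization makes the ranking easy to read: the brightest dye sits at 100% and every other dye is a fraction of it.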

00:36:31-Slide 59

Here, it’s just showing you the values that we got at the basal voltage, and then we look at whether we see an increase at plus 50 volts or plus 100 volts. We see that most of the time the basal voltage is good, but for BV510, for example, we saw that we had to increase by 100 volts to find the best separation between negative and positive.

00:36:54-Slide 60

So, you need to do that. You can ease the comparison a bit with a normalized stain index, where you take the brightest, the highest value, as 100% and then compute everything relative to it.

00:37:10-Slide 61

That helps you find what your optimal PMT voltage is. You see that for most detectors it was the minimal voltage, but sometimes we had to change it for given detectors.

00:37:23-Slide 62

Once you have done that, you run your brightness index again at the optimal voltages, and you can define this brightness index. From that, on the Attune NxT, we could see that BV421 was the brightest dye in our hands, but most of the top dyes were Super Bright dyes; Super Brights are already super bright. Basically, this helps you. You don’t really look at the relative brightness of dyes against each other; what you need to do is define categories of very bright dyes, bright dyes, dim dyes, and really bad dyes, to then match them with your antibodies later on.

00:38:11-Slide 63

Then, the next thing is to look at the data spread that comes from the spectral contamination between dyes. We know that fluorescent dyes overlap, especially in their emission spectra, so some light that comes from one fluorochrome can be detected in adjacent detectors. We solve this by compensation. The more spectral overlap you have, the more data spread this leads to. This spread can actually decrease the resolution of your populations, so you need to calculate it.

00:38:54-Slide 64

Then, an important point here is that spillover spreading is actually independent of compensation, because it is really a property of the dye. If you keep the voltage constant for the detector in which you compute the spread, and vary only the primary voltage at which your conjugate is best detected, you see that as you decrease or increase that voltage, the compensation changes but the spread we see is always the same.

00:39:39-Slide 65

So, the gain is really to compute that first, to be able then to minimize the spread that will be generated, and to explore the statistics.

00:39:51-Slide 66

We do that by using the metric depicted here. You take the spread value that you compute in the secondary detector, which is here, for example, PE-Cy7, and you divide that by the fluorescence intensity, which is actually given by the separation between your positive and negative populations: the MFI of your positive population minus the MFI of your negative population.
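
As a sketch, the metric as the speaker describes it (spread picked up in the secondary detector, divided by the separation achieved in the primary detector) might look like this. The published spillover spreading formula by Nguyen, Perfetto, Mage and Roederer differs slightly (it uses square roots), so treat this only as an illustration of the description above; the dye names and numbers are made up.

```python
def relative_spread(secondary_spread, primary_pos_mfi, primary_neg_mfi):
    # Spread induced in the secondary detector, divided by the
    # separation (positive MFI minus negative MFI) achieved in the
    # primary detector for that conjugate.
    separation = primary_pos_mfi - primary_neg_mfi
    if separation <= 0:
        raise ValueError("positive population must be brighter than negative")
    return secondary_spread / separation

# Hypothetical single-stain conjugate read in its primary detector,
# spreading into the PE-Cy7 detector:
value = relative_spread(secondary_spread=350.0,
                        primary_pos_mfi=12000.0,
                        primary_neg_mfi=200.0)
```

A smaller value means the conjugate costs you less resolution in the neighboring detector.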

00:40:23-Slide 67

It’s very important to do that step because it will have strong implications for later data analysis, especially advanced methodologies. Here, we actually only looked at a single stained color, BB700, that I read on this instrument in the PE-Cy5.5 channel. Then I ran unsupervised analysis, tSNE dimension reduction. I actually did two of these analyses: in one I omitted the BB700 data from the clustering, and in one I included it. Otherwise, the input parameters are the same between the two tSNEs. It’s only that the image is different. On one side, I see two populations. On the other side, I see three, with some kind of extra structure that we can see. That’s only because I omitted BB700 from the clustering. You can see that one side of this structure is actually the positive spread of the dye and the other one is the lower spread of the dye. So, you really need to be careful and minimize the spread at all costs, because if you don’t use such a marker in your clustering in later downstream, advanced analysis, you might create very big artifacts.

00:41:58-Slide 68

Then, you also need to set your transformation well around zero. Actually, the more spread you have in the data, the more your negative population will be impacted as well in the multidimensional space. That’s why you also need to minimize this, because otherwise, again, it will have an impact on clustering methods later on, for example.
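
Setting the transformation around zero is typically done with an arcsinh (or logicle) scale. Here is a minimal arcsinh sketch; the cofactor value is an assumption you would tune per detector, widening it for channels that receive more spread so the negatives stay compact around zero.

```python
import numpy as np

def arcsinh_scale(values, cofactor=150.0):
    # Arcsinh display transform: roughly linear near zero (so spread
    # negatives are not exaggerated) and logarithmic for large values.
    # A larger cofactor widens the linear region around zero.
    return np.arcsinh(np.asarray(values, dtype=float) / cofactor)

events = np.array([-400.0, -50.0, 0.0, 50.0, 400.0, 50000.0])
scaled = arcsinh_scale(events)
```

Because the transform is symmetric and smooth through zero, events spread below zero are not split into an artificial second peak the way a pure log display would split them.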

00:42:23-Slide 69

So, you need to do that. Actually, it’s not something we discovered. It has been published by Enrico Lugli and Mario Roederer recently.

00:42:33-Slide 70

Using the bright index, you actually have the data to compute the spillover spreading matrix with the help of the formula I gave you before. That’s the spillover spreading matrix that we actually defined on the Attune NxT. So now, we know where these red dots are, which show where one detector spreads very badly into another one, so we know what the worst combinations are that we should not pick for our panel designs.
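
Once the pairwise spread values are in hand, finding the "red dots" is just a table scan. This sketch uses invented dye names and spread values and an arbitrary threshold; it only illustrates flagging the worst detector combinations to avoid during panel design.

```python
import numpy as np

# Hypothetical spillover spreading values: entry [i, j] is the spread
# that dye i induces in the detector of dye j (diagonal is zero).
dyes = ["BV421", "BV510", "PE", "PE-Cy7"]
spread = np.array([
    [0.00, 0.45, 0.02, 0.01],
    [0.30, 0.00, 0.05, 0.02],
    [0.01, 0.03, 0.00, 0.60],
    [0.02, 0.02, 0.25, 0.00],
])

# Flag combinations whose spread exceeds the chosen threshold: these
# are the pairs you would avoid putting on co-expressed markers.
threshold = 0.4
bad_pairs = sorted(
    (dyes[i], dyes[j])
    for i in range(len(dyes))
    for j in range(len(dyes))
    if i != j and spread[i, j] > threshold
)
```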

00:43:06-Slide 71

Then, based on that, I can now look for dyes that I will be able to put on the antibodies to define these cell types.

00:43:15-Slide 72

Now I am matching the brightness: because I have the bright index on my instrument, I can match it to the antigen density.

00:43:28-Slide 73

This is still based on the previous knowledge of the scientist, but soon, at least for the mouse, the data will be released and available, because we are currently datamining the antigen density of all the molecules for which BD and BioLegend have surface antibodies.

00:43:47-Slide 74

Then, we can now look at these quantities, see what is highly expressed or not, and then match the dyes that we will use for discovering these populations.

00:44:02-Slide 75

I have to say that very soon, there will be an interface to do these laborious steps of instrument characterization, bright index, and spillover spreading matrix computation, which will be available with our data, so I encourage you to look at that. It’s there to help you get faster to that step of panel design on your given instrument.

00:44:24-Slide 76

Now, what we do, we want to do high-content panels.

00:44:29-Slide 77

Why? Because obviously we don’t only count on the Attune NxT; we also do some nice experiments. For example, phenotyping of peripheral blood cells: we can do that on the Attune NxT with a very small volume of blood and take advantage of the deep-well plates, and we actually get good resolution with this staining procedure in whole blood. That’s why I put it here, after analysis.

00:45:03-Slide 78

Here, what we wanted to do is transfer one assay, a phenotyping assay, onto the Attune NxT. It was a 10-color reference panel that we had, and we wanted to transfer it onto the Attune NxT, use Super Bright dyes, because we saw they were very bright on our instrument and could yield better resolution, and also integrate new markers for better deciphering the functional subsets of T cells.

00:45:34-Slide 79

With our bright index and the spreading matrix, we defined three different panel candidates.

00:45:46-Slide 80

And we could see here, for example, the impact of dye brightness on the markers. So, you can definitely see that some panels are better than others, and we ultimately selected the panel 3 configuration.

00:46:01-Slide 81

Here is just the end of it. I showed you the different rational steps for high-content panel design, which start from a serious evaluation of the sensitivity of your instrument and its detectors. Once you have that, you can design high-content panels on your instruments very quickly.

So, as a conclusion slide, I can testify that the Attune NxT is a very well-built instrument. Otherwise, we would have shifted away from it by now, because we have been using it since the legacy model and through all of these updates. So, we are very happy with the Attune NxT. As for the automated Autosampler, it’s a real plus when it is combined with acoustic focusing, because it allows you to count very diluted samples really fast. It’s very precise and gives really reliable event counting. We have not seen any carryover with it, because each time you inject a sample, the sample probe from the HCS system is actually washed and then prepares the next sample, so that’s really good. That’s why I think we’ve seen no carryover. It’s very robust and resistant to clogging. If you get a clog, it’s because something went really bad with your sample, I guess; some instruments are more susceptible to this, but the Attune NxT is fine. Then, once you have defined your optimal gains, it’s a really easy-to-use instrument for multicolor applications, and hopefully we will soon release this tool that will help you do that also very fast and in an efficient manner. Finally, I hope I will have convinced you that when you follow these simple guidelines, you can define high-content panels very efficiently and very quickly.

The only drawback I see actually comes from the encoding of the data on the Attune NxT, the number of bits we have. It might sometimes be tricky to actually scale your data around zero to visualize it perfectly, but I know that this is something being undertaken right now by the software developers at Thermo to address that point.

00:48:53-Slide 82

With this, I would like to thank everyone who was involved in this work in the lab, especially Pierre Grenot and Theo Loustaud. Pierre did the instrument characterization work, Theo Loustaud the bright index and spreading matrix calculations, and then all the other people on the team. Thank you very much for your attention, and I am open to questions now.

00:48:50-Question 1

Thank you, Herve, for that presentation. Some questions that we have received are, “Can you please describe some of your sample preparations for different tissues? In general, are there differences in how you prepare different types of tissues? Are there any tips you have to offer about best practices for doing so?”


Yes, there are differences. Actually, you need to optimize your cell extraction conditions per organ, that’s very clear. One tip I would give is to know which markers you are interested in and check whether they will stay on the cell under your cell extraction conditions. For example, if you use proteases that chop off cell surface molecules, you will have lots of cells at good viability, but you will have stripped off the markers you are interested in using for phenotyping. So, you need to check that. One way to do that is actually to spike a sample you know very well into your cell extraction conditions. For example, we usually do that with splenocytes: during the digestion and mechanical dissociation of our other tissue types, we check whether our own splenocytes, which we first label with a given color, are still found, and whether the surface molecules on them are conserved or not. That’s one way to check that your extraction conditions don’t trim your surface molecules of interest. Otherwise, look at viability. That’s something you can do with the Attune NxT very quickly.

Then, yes, sorry, I did not say that: obviously you would need to play with the enzymes you use, the time you apply them, and also the temperature.

00:50:58-Question 2

Thank you. Another question is, “For doing cell counts for single-cell genomics, it’s important to have good accuracy. Why is it so important that your cell counts come out accurately?”


It’s because if you put in too many cells, you will be able to detect very few genes per cell. These experiments are very expensive, and if you ruin one because you put in three times more cells than the assay can actually take, then it’s going to be very bitter, because you will have lost time, since it’s a time-consuming pipeline, and money, a lot of money. So, you really need to count accurately. I can tell you that, around us, we started to count these samples on the Attune NxT because at that time we did not provide the single-cell genomics part ourselves, so it was done remotely at another site, and people started to use the counts given by the Attune NxT because that was yielding better results at the end, after sequencing of the encapsulated cells. So, for us, it’s the best way to do this.

00:52:59-Question 3

When looking at the spillover spreading matrix, frequently people will do this with FlowJo, but you showed it using Tableau. Why did you choose Tableau, and are there other options besides FlowJo and Tableau for this?


Actually, FlowJo is fine. You can do it with it. What we like about Tableau is the visualization; it’s not for the data analysis that we use it. Basically, we read tables into Tableau, and we like using Tableau because we can play with the visualization, tune it, and overlay several pieces of information. For example, in the spreading matrix you saw, in Tableau we can scale the size of the circles by, for example, the stain index, and then in one visualization we have two pieces of information: stain index and spread, which are the two components you need to take into account when making your choice of the right dye for the right antibody.

00:53:25-Question 4

Just a general question. For a researcher who’s just starting out using flow cytometry for phenotyping, in your experience, what have you found to be the most common mistakes that people will make when deciding on a panel and how to run a panel?


Most of the time, these people don’t characterize their instrument. That’s point A. Sometimes they take what is in the fridge, whatever antibodies are already available. Sometimes they use a panel that works well on one instrument and think it will work well on the next one. I think all of these are very common mistakes. I’m a big believer in the fact that you need to design your panel for your instrument, and that’s how you achieve the best results, because you get the best resolution. If you follow that, and that’s what we do on our several instruments, and hopefully this will be made faster by the application we will soon provide with our data, then there is really no reason why you should not rationally design your panels for a given instrument. At the end of the day, you save time and the data is of better quality, so why not do that.


That’s the end of our time for our question and answers today. I’d like to thank Dr. Herve Luche for his time and giving us this presentation. Today’s presentation was sponsored by Thermo Fisher Scientific in partnership with Wiley’s Current Protocols. This webinar will be available for on-demand viewing at the Current Protocols’ website at currentprotocols.com. Thank you for joining us today.

End of Recording

Identify Cancer Stem Cells in Primary Tumors and in Circulation using Flow Cytometry

Join Bruno Sainz as he discusses the advancements in the detection of cancer stem cells (CSC) in primary tumors and in circulation using acoustic flow cytometry.

00:00 – 03:17, Slide 1

Hello, we’re glad you’ve joined us for this live webinar; the use of flow cytometry to identify cancer stem cells in primary tumors and in circulation. I am Judy O’Rourke of LabRoots, and I’ll be moderating this session.

Today’s educational web seminar is presented by LabRoots, the leading scientific social networking website and provider of virtual events and webinars advancing scientific collaboration and learning. It’s brought to you by Thermo Fisher Scientific. Thermo Fisher is a world leader in serving science whose mission is to enable customers to make the world healthier, cleaner and safer. Thermo Fisher Scientific helps their customers accelerate life sciences research, solve complex analytical challenges, improve patient diagnostics and increase laboratory productivity. For more information, please visit: www.thermoscientific.com.

Let’s get started. You can post questions to the speaker during the presentation while they’re fresh in your mind. To do so, simply type them into the drop-down box located on the far left of your screen labelled, “Ask A Question”, and click on the send button. Questions will be answered after the presentation. To enlarge the slide window, click on the arrows at the top right-hand corner of the presentation window. If you experience technical problems seeing or hearing the presentation, just click on the “Support” tab found at the top right of the presentation window, or, report your problem by typing it into the “Answer Question” box located on the far left of your screens. This is an educational webinar and thus, offers free, continuing education credits. Please click on the “Continuing Education Credits” tab located at the top right of the presentation window and follow the process of obtaining your credits.

I now present today’s speaker, Bruno Sainz, PhD. Dr. Sainz is a Ramon y Cajal researcher, with co-appointments in the Department of Biochemistry at the Autonomous University of Madrid, the Department of Cancer Biology at the Instituto de Investigaciones Biomedicas Alberto Sols CSIC-UAM, and the chronic diseases and cancer area of the Ramon y Cajal Institute for Health Research. Dr. Sainz obtained his PhD in Microbiology and Immunology from Tulane University and began working on cancer in 2011, after joining the Spanish National Cancer Center as a staff scientist. He now heads the Cancer Stem Cells and Tumor Microenvironment Group, focusing on the study of cancer stem cells (CSCs) and the fibroinflammatory microenvironment in pancreatic ductal adenocarcinoma. His lab is running a combined basic and translational research program which synergistically combines studies on the biology of mouse and human CSCs, including their in vivo microenvironment, to enhance our understanding of the regulatory machinery of CSCs. The group is using state-of-the-art in vitro and in vivo systems, along with acoustic flow cytometry, to better understand the role of CSCs in cancer. Dr. Sainz’ complete bio is found on the LabRoots website. Dr. Sainz will now begin his presentation.

03:17 – 03:49, Slide 2

Thank you very much, Judy, for that introduction. I’ll be very happy to get started. I’ve set three basic goals for this webinar, as you can see. First, I want to discuss with you all how to identify cancer stem cells using flow cytometry, then evaluate factors that should be considered when using flow cytometry to identify cells from tumors and from blood, and also discuss sample preparation and choosing the correct cancer stem cell markers to identify these types of cells from blood and from tumors.

03:49 – 04:17, Slide 3

As we all know, 90% of cancer deaths are a result of metastatic disease. From a clinical or scientific perspective, one could say that understanding the cells that drive metastasis is absolutely fundamental. Oftentimes, the cells that drive metastasis are in fact different from the bulk tumor cells, which is something we always need to keep in mind when we’re thinking about metastatic disease.

04:17 – 04:33, Slide 4

Now, I think this point was best exemplified by a study published in 2012 in the New England Journal of Medicine by Gerlinger et al., in which the authors analyzed and sequenced nine different regions from a renal tumor, including three metastases.

04:33 – 05:31, Slide 5

This was a very important study because they found that the tumor itself contained about 128 mutations, but only a third were present in all the regions analyzed. Interestingly, a fourth were found in only one single region. The metastases appeared to have arisen, or been derived, from only one particular region, which is noted here in blue, Region 4. In general, the evolution of the tumor was not linear at all but branched like a tree, and this is something that we typically see in stem cells, which also divide in a very similar manner. In summary, the study is important because it highlights the inherent heterogeneity present within tumors and the contribution that very few cells have on basic processes of tumor evolution such as metastasis, here reflected by Region 4, which is the only region of the tumor that actually gives rise to the metastasis in a secondary organ.

05:31 – 05:57, Slide 6

Therefore, to turn the tide on this alarming statistic, that 90% of cancer deaths are the result of metastatic disease, we really need to better understand the tumor at the cellular level and also develop techniques to differentiate all of the essential cellular players involved, not only in the process of tumor formation but also in the process of metastasis.

05:57 – 06:38, Slide 7

Now, it’s been known for the past two decades that cancer stem cells, or CSCs, play a very important role in metastasis. These cells represent a very rare and small subpopulation of cells within the tumor that are highly tumorigenic and have stem-like properties. To date, they have been identified in just about all solid tumors, and they are believed to be drivers of, for example, resistance to therapy, disease relapse and, most importantly, metastasis, as they actually go into circulation, where we refer to them as circulating cancer stem cells.

06:38 – 07:41, Slide 8

Now, the cancer stem cell concept, as a whole, is not a very new concept. As early as 1855, there were multiple studies reporting stem-like cells in different cancers. We can think of Rudolf Virchow, and later Julius Cohnheim, as the grandfathers of the cancer stem cell concept, which they both proposed on the basis of observed similarities between fetal tissue and cancer. They proposed that cancer developed from the activation of dormant embryonic tissue remnants. We now know this is not true, but these were the first studies suggesting a possible link between cancer and embryonic tissue, or cancer stem cells. Since then, it took about 100 years to officially isolate and identify cancer stem cells, and this was due to the efforts of John Dick and colleagues in 1994, who for the first time were able to isolate and purify cancer stem cells from acute myeloid leukemia, AML.

07:41 – 09:11, Slide 9

Now, this very important discovery was based on the availability of two important techniques or methodologies: on the one hand, the availability of immunocompromised mice, which was very important for the capacity to implant human cells into a small animal model, and on the other, the advent and availability of flow cytometry, specifically fluorescence-activated cell sorting. Both of these techniques allowed the researchers to first isolate the cells of interest and then implant those cells in nude mice to test their tumorigenic capacity. As you can see here on the right, in one of the experiments and figures from their paper, the authors isolated both CD34-negative and CD34-positive AML cells via flow cytometry using antibodies against CD34. You can see very nicely that only the CD34-positive population are the cells that are tumorigenic, or have the capacity to form tumors, in immunocompromised mice. They went one step further and also used CD38, and isolated both CD34-positive, CD38-positive cells and CD34-positive, CD38-negative cells. It was the latter population, the CD34-positive, CD38-negative cells, that were the tumorigenic cells. This also showed that within the cancer stem cell population there is a hierarchy, and sometimes multiple cell surface markers need to be used to identify these very rare and tumorigenic cells.

09:11 – 09:36, Slide 10

Since then, cancer stem cells have been identified in just about all solid tumors, as you can see here, in liver cancer, in pancreatic cancer and in multiple other solid organs. It’s important to note that today it is well accepted that cancer stem cells are the root of the tumor and must be eliminated to ensure tumor eradication. 

09:36 – 10:28, Slide 11

Now, cancer stem cells maintain tumor heterogeneity, and they promote chemoresistance and metastasis due to their stem-like properties. We refer to these as stem-like properties because if we look within these cells, specifically at the molecular level, we often find pathways activated that we tend to find activated and functional in stem cells. If we look within a cancer stem cell, we can find Hedgehog signaling active, Notch signaling active, Nodal signaling active, and Hippo pathways active, which are pathways that we tend to find active in stem cells, and therefore we often refer to these cells as stem-like cells or cancer stem cells.

10:28 – 12:18, Slide 12

In addition, cancer stem cells reside within a very complex niche within the tumor, much like a stem cell resides within its niche. You can see this here, shown on the left: the cancer stem cell niche, from which the cancer stem cell receives cues or information. The niche provides information that promotes asymmetric division of cancer stem cells to give rise to, for example, transient hybrid cancer cells, or to more differentiated tumor cells within the tumor stroma or the bulk tumor. At one point or another, either the more differentiated cells, the transient hybrid cancer cells or even the cancer stem cells themselves can go into circulation. Once in circulation, they can make their way to secondary organs, for example the liver, and establish metastases in these organs. Now, circulation is a very difficult environment for these cells, so they have to, on one hand, resist apoptosis and, on the other, upregulate factors, ligands or receptors that will make them invisible to the immune system, basically promoting immune evasion.

It’s important to note, in addition, the relatively new concept of plasticity within the cancer stem cell concept, whereby a non-cancer stem cell, for example, can dedifferentiate into a cancer stem cell if the cancer stem cell compartment is compromised. This makes the situation even more complex, and targeting the cancer stem cell may therefore not always be as efficient as we previously thought, understanding that transient hybrid cells can, at some time or another, take their place via plasticity.

12:18 – 12:37, Slide 13

Thus, since the majority of cancer patients die from metastasis, what is needed, in my opinion, are methods to detect those cells that actually cause metastasis, be it circulating tumor cells or, in this case, and specifically for this webinar, circulating cancer stem cells.

12:37 – 13:35, Slide 14

In my laboratory, we study circulating tumor cells, cancer stem cells and circulating cancer stem cells in the context of pancreatic ductal adenocarcinoma. PDAC, as I will refer to it, is a malignant neoplasm of the exocrine pancreas. It’s currently the fourth most frequent cause of cancer-related death and is projected to become the second most frequent cause of cancer-related death by 2030. It is currently the deadliest form of cancer, with a five-year survival rate of less than 7%. Pancreatic tumors are highly resistant to chemo- and radiotherapy, and unfortunately PDAC is usually diagnosed at a very late stage, once patients already have metastases, due to a lack of early symptoms and reliable imaging methods. This is a terrible disease, and unfortunately there are no reliable or effective treatments for pancreatic cancer, so any advances in understanding this disease are very much welcome.

13:35 – 15:58, Slide 15

Now, let’s move on to how cancer stem cells can be identified. This is an important point, especially when you want to study a disease or a tumor that is mediated by cancer stem cells. For this webinar, what we will discuss is how cancer stem cells can be identified using flow cytometry.

15:58 – 14:52, Slide 15

The main method for identifying cancer stem cells has been, and still is, flow cytometry, which takes advantage of cell surface markers expressed on the cancer stem cell surface. The most common stem cell markers used for pancreatic cancer are shown here on this diagram: for example, EpCAM, CD133, CD44 and CXCR4, common markers used to identify and also isolate pancreatic cancer stem cells. Of course, isolating cancer stem cells is not trivial; it’s extremely difficult and oftentimes depends primarily on the use of antibodies that detect cell surface markers that are sometimes not exclusively expressed on cancer stem cells.

14:52 – 16:38, Slide 16

For that reason, we actually set out to identify new cancer stem cell markers that could be detected via flow cytometry but independently of the use of antibodies. In 2014, we published a study in Nature Methods describing a new inherent cancer stem cell marker that we termed ‘auto-fluorescence,’ as you can see here in the publication. What we did is we used primary patient-derived xenograft cultures from PDAC samples, and we observed very discrete green vesicles in these primary cultures, as you can see here in the first figure on the right. Interestingly, when these cells were cultured in conditions that enrich for cancer stem cells, known as spheres, what we saw repeatedly was an increase in this auto-fluorescence. At the microscopic level, the auto-fluorescence was restricted to vesicles covered in the ABC transporter ABCG2, as you can see here on the left in this confocal image, using ABCG2 fused to mCherry to detect its localization within the cell, and the auto-fluorescence has a very specific excitation and emission spectrum, which is very different from, for example, GFP. Most importantly, we can identify and isolate these cells using flow cytometry without the need for antibodies, as you can see here in these flow cytometric plots: in adherent cultures we can detect a percentage of cells that are auto-fluorescent, and that percentage increases when we put the cells in conditions that promote the enrichment of cancer stem cells, for example conditions of sphere formation. What we were also able to confirm is that these auto-fluorescent cancer stem cells, upon injection into nude mice, were always more tumorigenic than the non-auto-fluorescent cells.
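
Gating autofluorescent cells without an antibody boils down to setting a threshold from a control population and counting events above it. Here is a toy numpy sketch with simulated log-normal intensities; the distributions, percentages, and gating rule are invented for illustration and are not the paper's data or protocol.

```python
import numpy as np

rng = np.random.default_rng(7)

# Simulated green-channel intensities: a control of bulk (non-CSC)
# cells, and a sample containing a small bright autofluorescent subset.
control = rng.lognormal(mean=4.0, sigma=0.4, size=10_000)
sample = np.concatenate([
    rng.lognormal(4.0, 0.4, 9_700),   # non-CSC events
    rng.lognormal(6.5, 0.3, 300),     # bright autofluorescent events (~3%)
])

# Place the gate just above the control distribution (99.9th
# percentile), then report the fraction of sample events above it.
gate = np.quantile(control, 0.999)
pct_autofluorescent = 100.0 * np.mean(sample > gate)
```

The same threshold-from-control logic is what a cytometry analysis package applies when you draw a gate on the unstained or negative population.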

16:38 – 17:26, Slide 17

ABCG2 accumulation on the membrane of these vesicles was only observed in the auto-fluorescent-positive cells, as you can see here on the right. Comparing sorted auto-fluorescent-positive cells and sorted auto-fluorescent-negative cells, we only saw the accumulation of ABCG2 in the vesicles of the cells that were auto-fluorescent. We later learned that the substance accumulated in these vesicles was actually nothing more than the naturally fluorescent vitamin riboflavin. Riboflavin is naturally fluorescent at very high concentrations, and riboflavin can also cross membranes with the aid of ABCG2 transporters, so it was not illogical that we would have riboflavin accumulation in vesicles that were coated in ABCG2.

17:26 – 17:50, Slide 18

Now, since this publication, we’ve extended our analyses of auto-fluorescent cancer stem cells to more tumors, primarily as a means of determining whether we can measure the cancer stem cell load in freshly resected tumors. In collaboration with local hospitals in the area, we’ve analyzed 92 tumors to date.

17:50 – 18:21, Slide 19

We’ve been able to detect auto-fluorescent cancer stem cells in a large number of these tumors. Interestingly, our ability to detect auto-fluorescent cancer stem cells is primarily restricted to tumors of the gastrointestinal tract, for example colon, pancreatic, stomach, rectal, gastric and liver cancers. This also highlights the fact that perhaps this cancer stem cell marker, auto-fluorescence, may be something unique to gastrointestinal tumors.

18:21 – 18:49, Slide 20

Importantly, these cells that we’ve been able to identify in freshly resected tumors can also be isolated via flow cytometry and cultured in vitro, as shown here. This is Subject 57, a rectal cancer. We were able to identify the auto-fluorescent cancer stem cells, isolate them and put them in culture, and you can see we were able to establish a culture, and these cultures maintained their auto-fluorescent vesicles.

18:49 – 19:27, Slide 21

We’re also able to inject these cells into small animal models, such as zebrafish. Shown here are zebrafish injected with auto-fluorescent-positive cells and auto-fluorescent-negative cells, isolated from a colon cancer sample, into the yolk sac, and we were able to evaluate their distribution 144 hours post-injection. You can see that the auto-fluorescent-negative cells that were injected were no longer present 144 hours post-injection, while the auto-fluorescent-positive cells that we injected were still present within the zebrafish and had begun to migrate and expand within the zebrafish model.

19:27 – 20:00, Slide 22

While flow cytometry is a powerful tool for identifying cell populations, many factors should be considered, especially when working with rare subpopulations of cells such as cancer stem cells, and even scarcer populations such as circulating cancer stem cells in blood. For the purpose of this webinar, I would like to focus on the latter, circulating cancer stem cells in blood, but the same considerations I’m going to discuss can be applied to cancer stem cells from tumor samples or from cell lines.

20:00 – 20:46, Slide 23

When we talk about circulating cancer stem cells, we tend to lump them into the category of circulating tumor cells. Detecting circulating tumor cells has been a very challenging process. The existing paradigm states that circulating tumor cells can be isolated or detected based on the expression of cell surface markers, by filtration by size, or by density separation. Other techniques have used negative selection of white blood cells: basically eliminating the white blood cells and all the red blood cells so that only the non-immune cells remain, in order to isolate circulating tumor cells from blood samples.

20:46 – 21:59, Slide 24

To do this, however, specialized platforms and instruments are needed. For example, the Veridex CellSearch system is one platform that is commonly used and FDA approved for certain types of cancers, and other platforms exist as well. These exclusive and specialized platforms, however, are very cost prohibitive in some settings. Also, CTCs detected using these platforms are identified with antibodies against, for example, EpCAM, which is an epithelial marker, or cytokeratins, which are likely not present on cells that have undergone EMT or have assumed a more mesenchymal-like state. It may be that some of the antibodies exclusive to these platforms are not specific for the circulating tumor cell population. Also, these platforms are not designed to detect cell clumps or aggregates, so this reduces our capacity to detect circulating tumor cells and circulating cancer stem cells that may travel as aggregates with support cells, which the literature is now referring to as a very likely possibility for circulating tumor cells. 

21:59 – 23:12, Slide 25

That being said, obviously there is room for improvement. Therefore, we believe that a flow cytometry-based approach is the best option for identifying circulating tumor cells and circulating cancer stem cells. However, you can see here on the right traditional hydrodynamic focusing, the conventional methodology used in flow cytometry. With hydrodynamic focusing, cells are not aligned in a continuous stream, resulting in many events being lost during sample acquisition. This tends not to be problematic if the population you’re looking for is present at very high percentages, but when you’re looking for a very rare, small population of cells, this can reduce the precision of your analysis or your ability to detect that population. To improve upon this, we have moved to acoustic focusing-based cytometry using the Thermo Fisher Scientific Attune NxT [Flow] Cytometer, which uses an acoustically focused sample stream to align cells for better precision and increased detection of very rare events, which is exactly the type of cell population that we are looking for. 

23:12 – 24:18, Slide 26

Again, going back to the existing platforms, another inconvenience is that most platforms implement an initial step to eliminate immune cells. We have to ask: if circulating tumor cells and circulating cancer stem cells are travelling in circulation with support cells, maybe some of those support cells are actually immune cells. If we implement an initial step to eliminate immune cells, we might be losing those circulating tumor cells or circulating cancer stem cells. Also, some of the results from these platforms are inconclusive, sometimes variable, and the procedure is time consuming. For example, detecting circulating tumor cells in pancreatic cancer patient samples is extremely challenging, and the Veridex system has not proven beneficial for detecting CTCs in PDAC patients. Therefore, what we need are more specific CTC markers to detect circulating tumor cells and circulating cancer stem cells, as well as newer, faster and easier methods of detection. 

24:18 – 25:00, Slide 27

Regarding the previous point number four, where I mentioned that the elimination of immune cells from the sample may compromise our ability to detect circulating tumor cells or circulating cancer stem cells: more studies are indicating that these circulating cells actually exist in circulation as aggregates with immune cells, so what we’ve tried to do is take information from the literature and apply it to our platform. There have been some very nice recent studies, for example, a study by Adams et al., published in 2014 in PNAS, that suggest circulating tumor cells do circulate with immune cells. 

25:00 – 25:37, Slide 28

In this study in particular, what the authors observed was that circulating tumor cells were actually fused to macrophages, and the cells in circulation expressed both epithelial cell markers, as you can see here, EpCAM positive, cytokeratin positive, and at the same time expressed immune cell markers, for example, CD45. They were able to identify circulating tumor cells that also expressed markers present on immune cells, and they suggested these cells were actually circulating giant macrophages that had fused with circulating tumor cells. 

25:37 – 26:25, Slide 29

The question remains: Is this a very radical concept? In fact, it’s not. If we take a look at many of the studies performed in the Condeelis lab at Albert Einstein College of Medicine, they’ve been studying for years how macrophages are necessary for the intravasation of circulating tumor cells. Essentially, what they have shown is that macrophages promote tumor cell dissemination to secondary organs; they’re absolutely necessary for the tumor cell to enter into circulation. Whether this is done via a fusion process, or whether the macrophage is just along for the ride, is still yet to be determined. 

26:25 – 27:01, Slide 30

In addition, we have shown that one way macrophages may promote dissemination is by promoting epithelial–mesenchymal transition, or EMT, via secreted factors. In this example experiment, we used patient-derived xenograft primary cultures and treated them with M2-polarized macrophage-conditioned medium. What we saw is a clear mesenchymal transition mediated by macrophage-secreted factors, as early as eight days and as late as 25 days after treating our primary cultures with macrophage-conditioned medium. 

27:01 – 28:49, Slide 31

Our working model is that TAMs, or tumor-associated macrophages, present in the tumor actually promote cancer stem cell dissemination by promoting epithelial–mesenchymal transition, or EMT, via secreted factors, or perhaps also by directly interacting with cancer stem cells or cancer cells to promote this fusion event. This idea of TAMs promoting cancer stem cell dissemination is biologically relevant in the case of pancreatic cancer, as CD163-positive macrophages, or TAMs, are the main immune cell infiltrate present in PDAC. You can see this here in this immunohistochemical staining of a patient pancreatic tumor: note the massive infiltration of CD163-positive macrophages. We also see this in blood samples. Here we have a sample from a healthy control, where we can see a monocyte/macrophage population of about 1.2%. However, when we look at the same monocyte/macrophage population in blood from a metastatic pancreatic cancer sample, we see about a three-fold increase in the percentage of these cells present in circulation.

If we look specifically at this population and use a marker for M2 macrophages, or tumor-associated macrophages, in this example CD163, what we see via flow cytometry is that 44.6% of the monocytes/macrophages are CD163 positive, indicating that there’s an increase in M2, TAM-like macrophages in circulation, which are likely coming from the primary tumor. 

28:49 – 29:08, Slide 32

The hypothesis is that macrophages promote cancer stem cell dissemination, intravasation and circulation, and facilitate their extravasation to secondary organs in order to form metastases. 

29:08 – 29:40, Slide 33

This hypothesis was recently supported by a 2017 publication by Classen et al., where they observed fusion between circulating tumor cells and TAMs, as shown here, with a PDAC cell marker, ALDH1, and a macrophage marker, CD68. You can see that the cells they were able to isolate from blood samples were both ALDH1 positive and CD68 positive, again indicating a possible fusion event between tumor-associated macrophages and circulating tumor cells. 

29:40 – 30:06, Slide 34

We used basically the same approach but applied it to a flow cytometry platform, using CD163 and our auto-fluorescence as markers to detect double-positive cells. You can see here in Sample 38 that we were able to detect the double-positive population: 3.24% of the cells were positive for both auto-fluorescence and CD163. 

30:06 – 30:59, Slide 35

When we isolated this population, we observed that these cells could form spheres in vitro, that these spheres retained their auto-fluorescent phenotype, and that they were able to form tumors in nude mice and metastasize to the liver following injection. Shown on the right is a table showing that CD163-positive macrophages increase in PDAC samples compared to healthy samples, and we also see an increase in the double-positive cells, positive for both auto-fluorescence and CD163, in our PDAC samples. These data indicate that these TAM/CTC fused cells could be used to develop a liquid biopsy assay. 

30:59 – 31:35, Slide 36

To move this toward a more feasible liquid biopsy screening assay, we are now exclusively using acoustic-based cytometry, which allows for more precision and the ability to detect these very rare events compared to conventional cytometry, that is, traditional hydrodynamic focusing cytometry. With acoustic cytometry, we’re able to detect three to five times more events than we were able to detect using traditional hydrodynamic focusing.

31:35 – 33:30, Slide 37

Here in this slide is an example of the detection limits of this approach. In this experiment we used fixed blood samples and spiked them with pancreatic ductal adenocarcinoma cells. We analyzed them on the Attune NxT [Flow] Cytometer using the violet laser together with the No-Wash No-Lyse Filter Set, to gate on nucleated cells without lysing the sample. The No-Wash No-Lyse filter set allows one to gate directly on nucleated cells, so the sample does not need to be lysed and does not need to be washed; you can fix the sample, or not fix the sample, add your antibodies to the blood and detect your cell target without introducing extra variables such as lysing red blood cells, which can damage your cells, or multiple washing steps, which can reduce your ability to detect cells and can actually cause you to lose some of the very rare cells you’re trying to detect.

What you can see here is that within the population we detected EpCAM-positive PDAC cells with high efficiency and precision, demonstrating the utility of this approach. The first column is blood into which no PDAC cells were introduced, and we have 0% detection of EpCAM-positive PDAC cells. Then we did dilutions from ten to the sixth [10^6] down to ten to the third [10^3], and we were able to identify the cells very well: when we should have detected 200,000 cells, we detected 106,000; for an expected 20,000 cells, we detected 11,000; for 2,000, we detected 1,300; and for 200 cells in theory, we detected 98, showing very well the utility, precision and efficiency of this approach using the Attune NxT cytometer. 
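As a quick way to read the spike-in table, the recovery at each dilution can be computed directly from the expected and detected counts quoted above (a minimal sketch in Python; the counts are the ones stated in the talk):

```python
# Expected vs. detected EpCAM+ PDAC cell counts from the dilution series
# described above (spiked fixed blood, Attune NxT, No-Wash No-Lyse gating).
pairs = [
    (200_000, 106_000),
    (20_000, 11_000),
    (2_000, 1_300),
    (200, 98),
]

for expected, detected in pairs:
    recovery = detected / expected  # fraction of spiked cells recovered
    print(f"expected {expected:>7,}: detected {detected:>7,} ({recovery:.0%} recovery)")
```

On these numbers, recovery sits roughly between 49% and 65% across four orders of magnitude of input.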

33:30 – 34:27, Slide 38

Our main hypothesis, therefore, is that circulating cancer stem cells enter into circulation accompanied by tumor-promoting macrophages. We want to apply this methodology to primary samples using CD163 as our marker for the tumor-associated macrophages, with auto-fluorescence and EpCAM as our markers for the cancer stem cells and the tumor cells, respectively. Using the Attune NxT No-Wash No-Lyse filter gating strategy, we can gate on the nucleated cells, and within that population we can see very nicely the distribution of the immune cells: granulocytes, T cells, monocytes, etcetera. Within this population, we can specifically detect the CD163-positive, or tumor-associated, macrophages using antibodies against CD163. 

34:27 – 35:07, Slide 39

As you can see in this analysis, here we have a healthy sample, and within the CD163-positive population we were not able to detect EpCAM-positive cells or auto-fluorescent cells. However, if you look at the pancreatic cancer sample, within the CD163-positive population we were able to detect very nicely 2.18% EpCAM-positive cells and 28.6% auto-fluorescent-positive cells, indicating that our target double-positive population is present in pancreatic cancer samples but not in healthy samples. 

35:07 – 35:46, Slide 40

If we look at this analysis, we can see something very interesting: those samples with the highest numbers of double-positive cells, indicated here with the red bars, correlate very well with lower overall survival and lower progression-free survival. This is basically what we’re looking for: a marker that will allow us to predict overall survival or progression-free survival. 

35:46 – 36:34, Slide 41

We envision cytometry-based liquid biopsy assay for the detection of circulating tumor cells that could be used for countless purposes in hospitals, with the main goal, however, of advancing personalized medicine options for cancer subjects. We envision this technology being used, for example, for quantifying circulating tumor cells or circulating cancer stem cells before and after treatment to determine drug efficacy or chemotherapy efficacy. We also envision that this platform could be used for monitoring circulating tumor cells or circulating cancer stem cell population post-surgical intervention and perhaps assessing the appearance of circulating tumor cells or circulating cancer stem cells as a predictive marker of cancer development, or reappearance during remission. 

36:34 – 37:00, Slide 42

Of course, as with all platforms or assays, sample preparation and choosing the right panel of markers is very important and needs to be empirically determined by each user. 

Our workflow is very specific to the type of tumor we work with and the type of cancer stem cells we work with and the cancer stem cell markers that we have chosen. 

37:00 – 40:12, Slide 43

For sample preparation and choosing cancer stem cell markers, it’s important to remember that each tumor may be different. What I will present here is our workflow and the requirements and preparations that our samples undergo for our analysis. However, it’s highly recommended that each user developing a similar liquid biopsy assay empirically determine their own conditions. Our samples come from the hospitals and are acquired from our patient population of interest.

These samples, for example blood samples, are collected in various types of blood collection tubes, for example heparin tubes or EDTA tubes. The nice thing about our platform is that you can choose whether or not to fix the cells. If you choose to fix the cells, we have been using the TransFix Cellular Antigen Stabilizing Reagent, which works very well for stabilizing the antigens on the cells; the antigens are actually stable for up to 10 days at 4 degrees Celsius. As I mentioned before, when we take these samples and run them through the Attune NxT [Flow] Cytometer, we do what we call a sample clean-up by using the violet laser in conjunction with the No-Wash No-Lyse Filter Kit, which allows us to gate directly on the nucleated cells, which are the population of interest. No need to lyse the cells, no need to wash; we can go immediately to the nucleated cells and determine whether or not our circulating tumor cells or circulating cancer stem cells are within the nucleated population.

If we don’t fix our cells, what we tend to do is also add a dead cell exclusion marker, so that we can exclude all dead cells and focus just on the live, nucleated cells. Of course, you have to choose your antibodies of choice, for example antibodies that directly mark your immune cells of interest. If TAMs, or tumor-associated macrophages, are of interest, we tend to use the Invitrogen antibodies against CD45 and CD163; however, other markers can be added, such as markers for myeloid cells, markers for T cells, etcetera. Of course, you have to choose which cancer stem cell markers are appropriate for your tumor type and your conditions, and whether you are trying to identify circulating cancer stem cells or cancer stem cells directly from tumors. For example, you might use EpCAM or pan-cytokeratin to identify epithelial cells in blood samples, and of course, we also include our auto-fluorescence marker. Once the staining has been performed, we can run these samples very efficiently and thoroughly through the Attune NxT [Flow] Cytometer, do a multicolor analysis and quantification, and actually quantify the number of events, or the number of CTCs or circulating cancer stem cells, using CountBright Absolute Counting Beads. In the end, obviously, including your healthy samples together with your experimental samples will allow you to gain confidence in your results and to compare the signal obtained from the experimental sample to that from the healthy sample.
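For the bead-based quantification step, the arithmetic follows the general counting-bead formula: cells/µL = (cell events ÷ bead events) × (beads added ÷ sample volume). A minimal sketch with hypothetical numbers; check the assigned bead count of your specific lot before relying on this in practice:

```python
def absolute_count(cell_events, bead_events, beads_added, sample_volume_ul):
    """Target-cell concentration (cells/µL) from a counting-bead acquisition."""
    return (cell_events / bead_events) * (beads_added / sample_volume_ul)

# Hypothetical example: 150 CTC-like events, 10,000 bead events,
# 50,000 beads added to a 100 µL stained sample.
print(absolute_count(150, 10_000, 50_000, 100))  # 7.5 cells/µL
```

Because the ratio of cell events to bead events is volume-independent, this estimate does not require knowing exactly how much sample the cytometer acquired.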

With that, I would like to thank you all for your attention. I would be more than happy to further explain any points in the presentation or to answer any questions.

40:12 – 40:30, Slide 44

Thank you, Dr. Sainz, for your presentation. A quick reminder for our audience on how to submit questions, simply type them into the drop-down box located on the far left of your presentation window labelled “Ask a Question”, and click on the send button. Dr. Sainz will answer as many questions as time permits. 

40:30 - 41:42, Question 1

The first question is, “Can you combine the auto-fluorescence with a well-known cell surface marker to increase the ability to detect cancer stem cells from tumors or from blood?” 

You can definitely combine the auto-fluorescence marker with any marker of interest. As you read in the literature, just about every month a new stem cell marker is being identified, so the appeal of the auto-fluorescence is that it’s antibody-independent, which allows you to include more antibodies without having issues with isotype controls, etcetera. The only limitation is that your channel for green fluorescence is occupied by the auto-fluorescence, but you can definitely combine other markers based on the number of lasers you have; especially if you use the violet laser, you have a larger gamut of antibodies that can be included. 

41:42 - 42:35, Question 2

Thank you. “Why do cancer stem cells accumulate riboflavin in these vesicles and do you think there is a biological benefit?”

That’s an excellent question. Why do they accumulate the riboflavin? At the biological or mechanistic level, we know it accumulates because these cells overexpress ABCG2 on the surface of these vesicles, and riboflavin is a substrate of the ABCG2 transporter. It will pass through a membrane that has this transporter, and because the vesicles overexpress it, the natural result is its accumulation within the vesicles. Now, whether this accumulation of riboflavin has any sort of biological benefit, I think we still do not know. Obviously, riboflavin has many biological functions; it’s a precursor for flavoproteins, so it may be that the cancer stem cells are accumulating riboflavin to ensure the production or availability of flavoproteins, which function in many different biological processes, but that question is still open and unanswered. 

42:35 - 43:36, Question 3

Thank you for that. The next one is, “Is there one universal marker that could detect all the cancer stem cells subpopulation across all tumor entities?”

That would be fantastic, and I wish there were. It is nice that the auto-fluorescence biomarker seems to be a very good marker for identifying cancer stem cells of gastrointestinal origin. The only thing I would be cautious about: do there exist cancer stem cells that are not auto-fluorescence positive? Perhaps, and most likely. Do there exist cancer stem cells that express markers we have yet to identify? Probably. If we limit ourselves to one marker, we are just getting a glimpse of the cancer stem cell population that expresses that marker, and we should keep in mind that there are other cancer stem cell populations that probably express other markers; if we don’t use those markers, we’re losing that information as well. 

43:36 - 44:51, Question 4

Thank you. Looks like we have time for one more question. “What do you recommend for number of cells required for analysis?”

I can answer in two parts: from primary tumors and from blood samples. From primary tumors, I would suggest, if possible, anywhere from five to 10 million cells as a good amount for an analysis of cancer stem cell percentages. We have to keep in mind that depending on the tumor type you’re studying, the cancer stem cell population can be anywhere from 0.5% to 5% to 10%, so based on those percentages, you can extrapolate the number of cells that you would need. For merely detecting the cells by flow cytometry, you can obviously reduce those numbers, and with the Attune NxT [Flow] Cytometer, you gain greater confidence and precision because you’re able to capture each and every event. If you’re using traditional cytometry, you might want to increase those numbers. From blood samples, a good starting volume is about 5 mL of blood, which can be stained and analyzed as described in this presentation. 
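The extrapolation described in this answer can be written down directly: divide the number of events you want from the rare population by its expected frequency. A minimal sketch (the 0.5%–10% frequencies follow the ranges mentioned above; the target event count of 100 is an illustrative choice, not a recommendation from the talk):

```python
def cells_needed(target_events, population_frequency):
    """Total cells to acquire so roughly target_events fall in the rare population."""
    return int(round(target_events / population_frequency))

# To record ~100 cancer stem cell events:
print(cells_needed(100, 0.005))  # 0.5% population -> 20,000 total cells
print(cells_needed(100, 0.10))   # 10% population  ->  1,000 total cells
```

In practice you would acquire more than this minimum to allow for staining losses and events dropped during acquisition, which is where acoustic focusing helps.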

44:51 - 45:52, Moderator

I would like to, once again, thank Dr. Sainz for his presentation. Do you have any final comments?

No. I would like to thank you for the opportunity to present our work and the experiments and research that we’re doing in our laboratory. I also thank Thermo Fisher Scientific for this opportunity, and all of you for your attention.

Thank you for that. I’d like to thank the audience for joining us today and for their interesting questions. Questions we did not have time for today will be addressed by the speaker via the contact information you provided at the time of registration. We would like to thank our sponsor, Thermo Fisher Scientific, for underwriting today’s educational webcast. This webcast can be viewed on-demand through June 2019. LabRoots will alert you via email when it’s available for replay. We encourage you to share that email with your colleagues who may have missed today’s live event. Until next time, good bye.

Analysis of Surface Antigens on Exosomes using the Invitrogen Attune NxT Flow Cytometer

Join Steve McClellan as he shares how he analyzes individual exosomes for the expression of a variety of surface markers using multi-color flow cytometry.

00:00:00 - 00:00:54, Slide 1

Hello. We’re glad you’ve joined us for this live webinar, Analysis of Surface Antigens on Exosomes using the Invitrogen Attune NxT Flow Cytometer. I’m Julia O’Roarke and I’ll be moderating this session. Today’s educational web seminar is presented by LabRoots, the leading scientific social networking website and provider of virtual events and webinars advancing scientific collaboration and learning, and is brought to you by Thermo Fisher Scientific. Thermo Fisher is a world leader in serving science whose mission is to enable customers to make the world healthier, cleaner, and safer. Thermo Fisher Scientific helps their customers accelerate life science research, solve complex analytical challenges, improve patient diagnostics, and increase laboratory productivity. For more information please visit www.thermoscientific.com.

00:00:54 - 00:01:49

Let’s get started. You can post questions to the speaker during the presentation while they’re fresh in your mind. To do so, simply type them into the dropdown box located on the far left of your screen labeled Ask a Question, and click on the send button. Questions will be answered after the presentation. To enlarge the slide window, click on the arrows at the top right-hand corner of the presentation window. If you experience any technical problems seeing or hearing the presentation, just click on the support tab found at the top-right of the presentation window or report your problem by typing it into the Ask a Question box located on the far left of the screen. This is an educational webinar and thus offers free continuing education credits. Please click on the continuing education credits tab located at the top-right of the presentation window and follow the process of obtaining your credit. 

00:01:49 - 00:02:52

I now present today’s speaker, Steve McClellan, the manager of Basic and Translational Research Operations at the University of South Alabama Mitchell Cancer Institute. He also serves as chief of the flow cytometry core laboratory. Steve has more than 30 years of expertise in advanced flow cytometry and cell sorting. He has worked in basic and clinical research in the areas of cancer biology, stem cell therapy, transplant immunology, and xenotransplantation. For the past 12 years, his lab has been conducting research on cancer stem cells and circulating tumor cells. The aim is to develop better methods of purification, culture and analysis at both genetic and functional levels, and to use cancer stem cells in high-throughput screening for drug discovery. Exosomes are the latest addition to the lab’s research efforts. His complete bio is found on the LabRoots’ website. Steve McClellan will now begin his presentation.

00:02:52 - 00:03:45

Thank you. I would like to share with you some of our recent progress in using the Attune NxT [Flow Cytometer] to analyze surface markers on exosomes. Some of the first published reports on exosomes date back to the mid-80s, and the field actually began in the world of platelet biology and reticulocyte maturation. Since that time, the number of publications on exosomes has grown exponentially. Around the turn of the millennium is when we see the first reports of using flow cytometry to actually look at exosomes, and in the last five to 10 years, we’ve seen yet another explosion in those publications due to improvements in the sensitivity of our instruments and to highly defined protocols that add rigor to what I consider to be a rather challenging field. 

00:03:48 - 00:05:16, Slide 2

Let’s talk about exosomes and microvesicles. These are vesicular particles shed from cells, comprised of a lipid bilayer and containing representative cytosolic content from their cell of origin: DNA, RNA, especially microRNAs, and proteins. Exosomes have been shown to affect intercellular communication. Early on, they were actually considered to be garbage bags; people thought that exosomes were being used as a way to shed extracellular debris and waste products, but we now know without much doubt that this is how cells communicate with each other. We’ll talk a little bit more about that later. The surface markers on these small exosomes can also be representative of the cell of origin, matching some of the same epitopes on the surface of the cell of origin. We generally consider exosomes to be the smallest of these particles, ranging in size from 30 to 150 nanometers. The next larger portion of this fraction of vesicles are what we call MVs, or just microvesicles, and those range in size from about 150 to 1,000 nanometers.
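The size ranges above are easy to encode as a quick classifier (a sketch only; boundary values vary across the literature, and the cutoffs here simply mirror the ones quoted in the talk):

```python
def classify_vesicle(diameter_nm):
    """Bucket a vesicle by diameter, using the ranges quoted above."""
    if 30 <= diameter_nm <= 150:
        return "exosome"            # 30-150 nm
    if 150 < diameter_nm <= 1000:
        return "microvesicle"       # 150-1,000 nm
    return "outside defined ranges"

print(classify_vesicle(100))   # exosome
print(classify_vesicle(500))   # microvesicle
```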

00:05:16 - 00:06:22

In the last 10 years, exosomes have been implicated in just a myriad of different disease processes. Almost every disease process has had some type of work looking at the exosomes related to it. We work on cancer but especially in cancer, exosomes are very critical. We now know that cancer cells secrete upwards of 20 times more microparticles than do normal healthy cells and in the setting of cancer, these actually have direct effects on the healthy target cells that they bind to once shed from the cancer cell, doing a myriad of things: promoting growth and metastases, vascularization of the new growth of the tumor, and also most importantly contributing to immune suppression. We have evidence that the exosomes leave the tumor cells and actually interact with the T-cells to try and dampen some of the immune response. 

00:06:22 - 00:07:34, Slide 3

This little cartoon depicts two different ways these vesicles are formed. The larger microvesicles actually bud off of the surface: literally pinched off and released away from the cell, hence they’re much bigger. True exosomes develop through an endocytic process: small clathrin-coated pits invaginate from the exterior of the membrane, forming what we call an early endosome. That little packet then matures into what we call a multivesicular endosome, and at that point those exosomes can either be released back to the surface, if the MVE is directed to fuse back to the membrane and, I guess I should say, re-vaginate, exposing the exosomes to the exterior surface, or lysosomes will bind onto the MVE to destroy its contents.

00:07:34 - 00:09:03, Slide 4

There are several ways to isolate exosomes. We’ll start with precipitation. Well, let me start by saying that there are really two primary sources you’re going to use to isolate exosomes. For basic research, you can easily use culture supernatant; cells grown in culture constantly shed exosomes, and it can be a very nice, clean source for looking at specific markers or microRNAs of interest. There is also a variety of body fluids that people have isolated exosomes from: urine, and blood, which really means plasma, and that distinction is important. Clotted blood, which would be referred to as serum, is not a good product to use; many of the exosomes get trapped in the clotting process, so your yield will be far lower with serum than with plasma. They’ve also been found in saliva and cerebrospinal fluid; just about any fluid in the body is going to have exosomes in it. You can precipitate these: in the last five years especially, we’ve seen an explosion of kits on the market that allow you to precipitate these small vesicles. 

00:09:03 - 00:11:12

In our hands, these kits don’t provide very good purity. They’re quick; usually within an hour you can isolate the little particles, but we’ve seen that the product might not even be of suitable purity for doing western blots. We certainly have not had any success using precipitation kits to purify exosomes for flow cytometric analysis. They seem to be very sticky after whatever chemicals are used for the precipitation, so pretty much the only thing we found them useful for was working towards DNA or RNA isolation at the end of that process.

Size exclusion filtration can happen in two ways. You can use membranes with a known pore size, say 100 nanometers, to filter out the largest vesicles, so that the exosomes, smaller than 100 nanometers, come through. We’ve played with this a little bit. They tend to clog pretty quickly [Laughter], so I don’t think it’s a very time-efficient way of doing it. I think if you’re really going to do size exclusion, the best way would be through chromatography: small columns with some sort of chromatography material that can do size exclusion, so the smallest particles run through first and you collect fractions, and those can provide very high purity. Sometimes you worry that you might not get full recovery, so your yield might be a little lower with this method, and it can certainly never be considered quantitative because each sample might not travel through the chromatography bed quite the same, but the fractions do come out very clean, so we like this method for flow cytometry.

00:11:12 - 00:12:26

Back in the olden days, when people first started trying to do flow cytometry on exosomes, immunoaffinity beads were really the way everybody did this: small magnetic or polystyrene beads coated with specific antibodies that bind exosomes (we'll talk more about these markers later: CD9, CD63, and CD81) to capture and localize the exosomes onto the beads. You can then wash and have a very pure preparation. You could go directly to isolating the DNA, RNA, or proteins from those exosomes at that step, or you could use more antibodies to label the surface of the exosomes now immobilized on the beads and do flow cytometry. That's really what people were aiming for before flow cytometers became sensitive enough to resolve the individual particles, which we'll get to in just a second. In the early days, you really could only see small beads on the flow cytometers, so the bead capture method was the only way to qualitatively assess what was on the surface of an exosome.

00:12:26 - 00:13:24

We and many others consider differential centrifugation to be probably the best way to isolate exosomes, at least with very good purity. It's highly time-consuming. As you can see on the right, there is a set of successive spins, so each preparation will take a minimum of six hours even if you push right through it. And you can't do too many at once because most centrifuge rotors only hold six tubes [Laughter], so you can only process six things at a time. It is time-consuming, but in our hands this provides probably the purest product we can work with, especially for flow cytometry, where the exosomes aren't left with any kind of sticky coating and we don't get too much non-specific antibody binding.

00:13:24 - 00:14:23

As we go through each successively faster spin, we generate the different fractions. The first spin removes cells and cell debris. The large vesicles pelleted in the second spin are considered apoptotic bodies and another type of vesicle called a large oncosome; we're not going to talk much about those today. The next spin, at 16,500g, pellets the medium-size particles, what we would consider microvesicles in that 150 to 1,000 nanometer range. Then we take that supernatant and put it in the ultracentrifuge for two hours, and that pellet is the exosomes, the small vesicles in the 30 to 150 nanometer range. The leftover juices are really just soluble material; there's not much left in the supernatant after the ultracentrifugation spin.
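The spin series above can be sketched as a simple lookup table. This is a minimal sketch in Python; the first-spin speed and the per-step durations (other than the two-hour ultracentrifuge run) are illustrative assumptions, not values from the talk.

```python
# Sketch of the differential centrifugation series described above.
# The g-forces for steps 2-4 and the two-hour final spin are from the
# talk; the first-spin speed and the other durations are assumed values.

SPIN_SERIES = [
    # (speed_g, duration_min, what_the_pellet_contains)
    (300,    10,  "cells and cell debris"),            # assumed values
    (2000,   20,  "apoptotic bodies, large oncosomes"),
    (16500,  30,  "microvesicles (~150-1,000 nm)"),
    (100000, 120, "exosomes (~30-150 nm)"),            # ultracentrifuge step
]

def describe_protocol(series):
    """One human-readable line per spin; the supernatant of each step
    is carried forward as the input to the next."""
    lines = []
    for step, (g, minutes, pellet) in enumerate(series, start=1):
        lines.append(f"Step {step}: spin at {g:,}g for {minutes} min -> pellet: {pellet}")
    return lines

for line in describe_protocol(SPIN_SERIES):
    print(line)
```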

00:14:23 - 00:15:38, Slide 5

Early on, when we started working with exosomes and trying to do flow cytometry with them, we came up with the idea of using ultra-filtered solutions to reduce the burden of debris in the solutions we use to prepare the exosomes, and also in the focusing fluid, or sheath fluid, that runs through the flow cytometer, which helps reduce background noise. Traditionally, sheath fluid is only filtered down to about a 100-nanometer particle size, and that's roughly the same size as exosomes. We instead use a PES membrane with only 30-nanometer pores, so that none of the solutions contain debris particles larger than the particles we're looking for. We've noticed a significant difference in the sensitivity of the instrument when we do this ultrafiltration, and we think it's really a key point in making this much more successful.

00:15:44 - 00:17:16, Slide 6

Once you've used whatever method you're going to use to isolate your microvesicles and exosomes, you need to be able to quantitate what size particle you've isolated. You've got to make sure that what you're calling an exosome is truly the right size. There are several ways to do this. Unfortunately, all of them involve yet another instrument to judge the size. Probably the most popular at this point in time is NTA, nanoparticle tracking analysis. These instruments use a laser to illuminate the particles in solution, but it's an imaging-based system: it actually watches the vesicles move around in solution and uses the physics of Brownian motion, through algorithms built on those laws, to calculate the size of the exosomes. It's very quick, and at the same time it can give you a relative concentration in particles per milliliter, so as I said, it's pretty much become one of the most popular systems for sizing exosomes these days.

00:17:16 - 00:18:52

A newer type of sizer is TRPS, Tunable Resistive Pulse Sensing. It's a mouthful. It's based on the old Coulter principle of sizing particles, just at the nanoscale. In this method, a membrane has a very small, nanometer-scale hole in it, and every time a particle crosses through, the electrical resistance from one side of the membrane to the other changes in proportion to the size of the particle. It's highly accurate, and it also gives you a very accurate measure of concentration. These instruments haven't been around that long, and I think that over time they'll probably overtake NTA because, in my opinion, they're probably quite a bit more sensitive. DLS, Dynamic Light Scattering, is also a laser-based system. The nanoparticles are in suspension, the laser hits them, and as light scatters off the particles, a number of detectors at different angles calculate the size from the light scatter pattern. These are also considered pretty accurate, but the problem with DLS is that it does not give you any idea of the concentration of particles in the solution.

00:18:52 - 00:19:50

Last but not least, the gold standard. If you want to measure the size of something, you look with an electron microscope, and that's true for exosomes as well. Traditionally, transmission electron microscopy, TEM, has been used. The newer modality of cryo-EM is probably better, since it helps preserve the structure a little more, but either works fine, and many reviewers want you to at least show EM to confirm that the other instruments you've used are accurate. We do submit samples for EM from time to time, just to double-check that we're on track with everything. I'll quickly mention, and I may say it again later, that here at our center we use DLS for sizing our exosomes.

00:19:53 - 00:22:16, Slide 6

Let's talk briefly about the Attune NxT [Flow Cytometer]. This is a different beast. The Attune NxT is the only system on the market that uses acoustic waves to focus the cells into a very tight, uniform line as they flow past the laser interrogation point. All other flow cytometers use the tried-and-true principle of hydrodynamic focusing, which uses high-pressure sheath fluid going through the flow cell, causing a swirling effect that forces the cells into a tight line. When you compare acoustic focusing to hydrodynamic focusing, you can achieve much tighter peak CVs than on traditional cytometers. It really is aligning those particles much more precisely as they cross the laser beam. Another advantage of acoustic focusing is that it gives us very discrete control over the speed at which our particles flow past the laser beam, so the Attune [NxT] can process samples much faster than just about any cytometer on the market. For us, the flipside is that we can also slow them way down. We run at the very slowest rate, 12.5 microliters per minute of sample injection, and that allows us to maximize the dwell time each little particle spends as it crosses the laser beam. The slower you go, the longer that dwell time is. Of course, we're talking microseconds, but it's enough to make a difference in the number of photons that can be captured. Running slower on this system really does increase the sensitivity, in that you collect more photons from each particle and get a brighter signal.
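Because the sample injection rate on a volumetric instrument is known, you can estimate the event rate before you run. A minimal sketch: the 12.5 microliters per minute figure is from the talk, while the particle concentration is a hypothetical example value.

```python
# Back-of-the-envelope event rate on a volumetric instrument.
# 12.5 uL/min is the slowest sample rate mentioned in the talk;
# the concentration below is a made-up example.

def events_per_second(concentration_per_ml, sample_rate_ul_per_min):
    """Expected particles crossing the laser per second, assuming every
    particle in the injected volume is detected."""
    per_ul = concentration_per_ml / 1000.0          # particles per microliter
    return per_ul * sample_rate_ul_per_min / 60.0   # sample rate in uL/s

# e.g. a prep diluted to 1e7 particles/mL run at the slowest rate:
rate = events_per_second(1e7, 12.5)
print(f"{rate:.0f} events/s")   # about 2083 events/s
```

A calculation like this is also handy when deciding how far to dilute a prep so that events arrive one at a time.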

00:22:16 - 00:23:21

Not only do we have the fluidics part, but the Attune also has fairly high-powered lasers. It can provide 50 milliwatts on the 405, 488, and 561 nanometer lines and 100 milliwatts on the far-red 647 line. It does that in combination with something called flat-top beam shaping optics. That's another mouthful. What it really means is that instead of focusing the laser into a very tight spot, which the cells may or may not correctly intersect, the laser beam is shaped to about 10 microns by 50 microns. What that means is that the alignment of the instrument becomes almost irrelevant, because the particle will always cross the flat, maximum-intensity region of the beam, which also helps with the sensitivity we're talking about. It just makes it more robust.

00:23:21 - 00:24:31

Part of the collaboration we've done with our friends at Thermo Fisher, at the Invitrogen headquarters in Eugene, Oregon, has been assisting them in evaluating new band-pass filters. We've come up with a different filter from the one normally supplied with the instrument in front of the side scatter detector, and this has greatly increased the sensitivity of the system. I think we and others in this collaborative effort have come to agree that the new band-pass filter really does make a huge difference, and as I mentioned on the previous slide, ultra-filtering the focusing fluid really does dramatically decrease the background noise we see when running samples.

00:24:31 - 00:25:55, Slide 7

How do we validate that a flow cytometer can actually see exosomes? Quality control material at this point in time consists primarily of microparticle beads, just like any other beads you would use to assess the performance of your flow cytometer, but in this case much smaller. Presently, the industry standard is considered to be the Apogee Flow Systems beads called ApogeeMix. They consist of a range of six silica particles and two polystyrene particles, which have green fluorescence embedded in them. The distinction between silica and polystyrene is that silica, as a material, has a refractive index closest to that of true vesicular particles. It's really considered the closest reference standard particle we can use. We also use BioCytex's Megamix particles. These are also polystyrene with some fluorescence embedded, and we actually like to mix those two.

00:25:55 - 00:27:09

On this histogram, you can see that the top two peaks are the Apogee polystyrene beads. I'll start by saying that we generally do not see those matching the size of any other particle; I'm not sure why, so we basically just ignore them. When you move down to the MegaMix polystyrene beads, you can see three populations at 500, 900, and 3,000 nanometers, the last of which is almost off-scale, and the red boxes indicate that those polystyrene beads line up pretty well with the similarly sized silica particles: the 500-nanometer MegaMix bead lines up with the 590-nanometer silica particle from the ApogeeMix, and similarly the 900 with the 880. Sometimes we don't resolve the 1,300-nanometer silica particle. Sometimes we see it, sometimes we don't, and we're not really sure why, but again, I don't really care, because I'm more interested in what's going on to the left of that.

00:27:09 - 00:27:55

Moving down from the 500, we resolve the next smallest particles at 300 and 240, and then the 180. Around the 10-to-the-fourth hash mark is where we start to see the smallest particles, and that's the range where we would expect exosomes to appear when we run them on the cytometer. Much below 10 to the fourth is pretty much considered electronic noise: the large orange portion in the corner is probably electronic noise, and the next cloud over is probably our sub-30-nanometer background particles that remain after the ultrafiltration.

00:28:01 - 00:30:12, Slide 8

As long as the beads look good, I'd say yes, I think my cytometer can see small particles. Well, let's test that. We make exosome fractions with our successive ultracentrifugation spins. On the far left, you can see where we've run just 500 microliters of ultra-filtered PBS. Again, there's a small amount of electronic noise at a threshold setting of 200; generally, we move that threshold up to about 500, where that red line is, so that we don't count noise particles, and the rest is just the 30-nanometer-or-less debris zone in the PBS. When we run exosomes in the next panel over, they sit pretty much in that very small zone where we would hope them to be, 30 to 150 nanometers, and as a point of reference, there's very little in the gate on the right. The next larger fraction, the microvesicles obtained at 16,500g, shows the concentration in the density plot moving to the right, so we're getting bigger, up to about 33% in that right gate. Then the apoptotic bodies, from the first spin at only 2,000g, shift the density plot even further right, and now we're at about 54%. Not only do we have confidence that we're seeing the exosomes, but the other fractions we know to be in a given size range all fall pretty much in the correct place. When we see things like this, we have fairly good confidence that the instrument is seeing individual exosomes and placing them on scale pretty much in the right spot relative to our reference beads.

00:30:18 - 00:31:45, Slide 9

Next, we have to make sure that the instrument is actually analyzing individual exosomes. This is critical. Many of our preps get highly diluted, anywhere from 1:100 up to 1:10,000, depending on how concentrated the preparations are. We generally take the pellet from the ultracentrifuge spin and resuspend it in a fairly small volume, which means we then have to do pretty large dilutions before this kind of flow cytometry work. To validate that we aren't seeing what we call a swarm effect, which would be multiple particles passing through the interrogation point of the laser at the same time, we run a simple dilution series. We make a starting 1:100 dilution, run it, and see that we get 5,711 counts per second. Successive 1:2 dilutions of that with ultra-filtered PBS should, in theory, cut that in half each time. The number in the center is what we actually acquire, and the theoretical number on the far right corresponds to half of 5,711, then half of that, and so on. You'll see that it matches fairly well.
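The halving logic of this swarm check is simple enough to script. A minimal sketch: the 5,711 starting count is from the talk, while the "measured" values below are hypothetical numbers for illustration.

```python
# Swarm-check dilution series: each successive 1:2 dilution should
# roughly halve the count rate. Starting count is from the talk; the
# measured values are hypothetical.

def expected_counts(start, n_dilutions):
    """Theoretical counts/s after n successive 1:2 dilutions."""
    return [start / (2 ** i) for i in range(n_dilutions + 1)]

def percent_deviation(measured, theoretical):
    """Unsigned percent difference of each measured value from theory."""
    return [abs(m - t) / t * 100 for m, t in zip(measured, theoretical)]

theory = expected_counts(5711, 3)      # [5711.0, 2855.5, 1427.75, 713.875]
measured = [5711, 2790, 1460, 705]     # hypothetical acquired counts
for t, m, d in zip(theory, measured, percent_deviation(measured, theory)):
    print(f"theory {t:7.1f}  measured {m:5d}  deviation {d:4.1f}%")
```

If the measured counts stop halving as you dilute, that is the signature of swarming: multiple particles are being read as one event.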

00:31:45 - 00:32:30

I'm sure there are probably acceptance criteria for what percent variance is tolerable. To me, something like this looks perfectly fine. It gives me confidence that we literally are putting only one particle past the laser at a time, and that's been one of the biggest problems in small-particle cytometry. Now that we're able to resolve these individual particles, we've got to make sure the sample is diluted enough that they literally go by one at a time. This is a critical quality control step that we do with every single exosome prep we make.

00:32:30 - 00:32:48, Slide 10

Now, I guess we need to figure out what we’re going to do with these exosomes. What are we going to look for? We’ve got one more little quality control step to go through.

00:32:48 - 00:35:02

When you do this ultracentrifugation step, you do get some other things pelleting besides exosomes. These might be small protein aggregates, so we've got to be able to distinguish true debris in that final spin from true lipid-bilayer vesicular particles. We do that using a lipophilic dye, which allows us to say, that's a vesicle, that's not, and make that call. We have tried a lot of dyes. We've settled on products from Thermo Fisher known as FM 1-43 and FM 4-64. These are lipophilic styryl dyes that intercalate into the outer leaflet of the membrane. My favorite property of these molecules is that they're not fluorescent until they actually intercalate into that outer membrane, similar to a DNA dye binding into a DNA strand. What this has allowed us to do is develop a no-wash staining protocol: we can add enough of the dye to stain, but not enough to cause problems with anything else. The debris particles will not pick it up, so this has become our favorite dye for this lipophilic identification. The key point of developing a no-wash staining protocol is that every wash takes at least two and a half hours: the ultracentrifuge comes up to speed, you spin for two hours, and it comes back down. There's no such thing as a quick wash when you deal with exosomes. We've worked very hard on our no-wash staining protocol.

00:35:02 - 00:36:45

There are other lipophilic dyes you can use. BODIPY TR Ceramide is just a lipid molecule with a fluorescent tag bound to it that will also intercalate into the outer membrane of the vesicle. One thing I realize I forgot to tell you is that FM 1-43 is excited by a 488-nanometer laser and emits in the 525 to about 625-nanometer range, the FITC to PE area. FM 4-64 needs green, 561-nanometer, laser excitation and emits further down in the red, around 700 nanometers. So you have two different color options to choose from as you decide which antibody markers you want to use; we always do this in combination with antibody staining. BODIPY is only available in one color; it uses 561-nanometer excitation and emits in the range of 575 to about 625 nanometers. We have not really tried it with or without washing. It may be possible to use that one without washing as well, but we have not validated that.
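When planning a panel around these dyes, it can help to encode the pairings in a small lookup so you can see which laser lines, and therefore which detector channels, remain free for antibodies. A sketch using only the excitation lines stated above; the `no_wash_validated` flags reflect the speaker's experience as described, not vendor specifications.

```python
# Dye/laser pairings as stated in the talk; check your own instrument
# configuration before relying on these.

LIPOPHILIC_DYES = {
    "FM 1-43":            {"excitation_nm": 488, "no_wash_validated": True},
    "FM 4-64":            {"excitation_nm": 561, "no_wash_validated": True},
    "BODIPY TR Ceramide": {"excitation_nm": 561, "no_wash_validated": False},
}

def dyes_for_laser(laser_nm):
    """Names of dyes excited by the given laser line, alphabetically."""
    return sorted(name for name, d in LIPOPHILIC_DYES.items()
                  if d["excitation_nm"] == laser_nm)

print(dyes_for_laser(561))   # ['BODIPY TR Ceramide', 'FM 4-64']
```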

00:36:45 - 00:38:21

The other options are the Di series, carbocyanine membrane dyes that have been widely used for many years as membrane labelers for in vivo cell tracking and cell-fusion experiments, as well as the PKH dyes, which similarly intercalate into the outer membrane. PKH has been made into a whole rainbow of colors; I think there are actually eight different fluorescent molecules across the whole span of the range a cytometer can detect. The big problem with these two kinds of dyes is that they require multiple washes to get rid of all the excess dye. They're really a pain to deal with. It's what we started with. We started with PKH, and actually [Laughter] one of my students was reading through a paper and saw a reference to FM 1-43, so we ordered it and tried it, and I said, "Okay, never again." [Laughter] That's where we stand. We believe that's probably your best choice, but there are many options. Once you've done that staining, you can say what's debris and what's a true lipophilic, potential exosome.

00:38:21 - 00:39:30

Now we use antibodies to look at the three gold-standard reference molecules, those being CD9, CD63, and CD81. You cannot call a particle an exosome unless it has at least one of those markers on its surface. Let me repeat that: if you don't see at least one of those three CD markers on the surface, you cannot call it an exosome. These are directly conjugated fluorescent antibodies, and we'll talk a little more about the way we use them. As I've said, we've got a no-wash protocol. And even though we use DLS to measure the size of our exosomes, another benefit of the Attune NxT cytometer is that it uses volumetric counting, so it can give us a fairly accurate concentration of the particles in the sample.

00:39:30 - 00:40:50

When we do those dilution tests, we're also validating the actual particle concentration per milliliter in that particular sample. Knowing that number, we aim to put about 250,000 exosomes in 250 microliters of ultra-filtered PBS. We generally just use standard 12 x 75 mm polystyrene FACS tubes. We add our FM 1-43 dye at a final concentration of 0.5 micromolar. Then we add our antibodies at the titrated optimum concentration, which is generally somewhere in the range of 50 to 200 nanograms per milliliter. We incubate the sample at room temperature for 10 minutes, then immediately place it on ice and try to run it on the cytometer as fast as possible. It's a fairly straightforward staining protocol, and we've done a lot of work to [Laughter] optimize it. Trust me. Titration of your antibodies is very important here, as it is in all aspects of flow cytometry. You must really titrate your antibodies to do good work.
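The recipe above boils down to a little pipetting arithmetic. A minimal sketch; the targets (250,000 exosomes in 250 microliters, 0.5 micromolar FM 1-43) are from the talk, while the stock concentrations are hypothetical example values.

```python
# Pipetting arithmetic for the no-wash staining mix described above.
# Stock concentrations below are hypothetical examples.

def volume_needed(target_particles, stock_particles_per_ml):
    """Microliters of exosome stock carrying the target particle count."""
    return target_particles / stock_particles_per_ml * 1000.0

def dye_volume_ul(final_uM, final_volume_ul, stock_uM):
    """Microliters of dye stock giving the desired final concentration."""
    return final_uM * final_volume_ul / stock_uM

exo_ul = volume_needed(250_000, stock_particles_per_ml=5e7)  # hypothetical stock
dye_ul = dye_volume_ul(0.5, 250, stock_uM=100)               # hypothetical 100 uM stock
print(f"exosome stock: {exo_ul:.1f} uL, FM 1-43 stock: {dye_ul:.2f} uL")
```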

00:40:50 - 00:42:29, Slide 11

This is an example of what the lipophilic dye looks like. The center panel is unstained exosomes. I've popped up the bead reference just to show you that we are in that range around the 10-to-the-fourth hash mark. The threshold was set a little higher when these samples were collected, but we are right there in that size zone for exosomes. The gate on the right shows the positive particles, a good percentage; I think we're at 84% on that one. We do see some variability, anywhere from 50% purity on the low end up to maybe 90% on the high end using differential centrifugation, and that's why we do all these QC steps for every single preparation we make. There are differences depending, of course, on what the sample starts from, culture supernatant versus a body fluid, but even what we would consider fairly uniform culture supernatants can give preparations of different purity. It's really important to do all these upfront QCs on every single preparation. We use this positive gate on the right to then look at antibodies.

00:42:29 - 00:44:05, Slide 12

We'll gate from here to the next panel, which shows a three-color stain. Now, we always use matched isotype controls to make sure these exosomes aren't too sticky, as I think I mentioned earlier. The precipitation kits would provide such sticky exosomes that the isotypes were as bright, if not brighter, than the marker-of-interest antibodies we were using. When we first saw that, we realized we couldn't use precipitation kits, because the staining was meaningless. Using the isotype controls, we set the positive gates and then look at the expression of our three gold-standard reference markers. The top right panel shows that almost 90% of the exosomes in this preparation, gated from the FM 1-43-positive lipid gate, are expressing CD63 and CD81. In the bottom panel, plotting CD81 versus CD9, the CD9 isn't that strong; probably about 60% have crossed that threshold. Again, every prep can show high variability in how much of each marker is there. Sometimes one marker is completely absent, and you might only have, say, CD9 and CD81.

00:44:05 - 00:45:45

We do all three because we need to assess what that particular preparation has on its surface. We'll pick the one with the preponderance; in this case, we chose CD63, and then we do extra staining tubes with two more markers of interest. We put in our CD63 antibody, and in this case we're staining exosomes isolated from the culture supernatant of cancer cells, the MIA PaCa pancreatic cancer cell line, actually. We're comparing cells grown in two different conditions, hypoxic and normoxic. We've done all the spins with the culture supernatant from these two comparative conditions, and we're looking for the markers CD44 and CD184. We actually saw quite a distinct difference. In normoxic conditions, not all, probably only, let me look at the numbers here, 40% of the exosomes stain positive for CD184. They're fairly uniform for CD44; all of them have CD44 on them, but in normoxic conditions we only see about 40% CD184-positive. In hypoxic conditions, and this is related to cancer stemness, these markers increased, and now all of the exosomes are double-positive for CD44 and CD184.

Those are just representative figures of the kind of data we're generating. One quick comment on the choice of fluorochromes: you must strive to use only the very brightest fluorochromes, those being the polymer violet dyes, phycoerythrin, and the Alexa Fluor dyes, especially Alexa Fluor 647. It's hard to find, but one of my favorites is also Alexa Fluor 546; it's also one of the brightest. You really need good, strong, bright dyes because exosomes are small. An antibody is probably less than one-hundredth the size of an exosome, so you can't really stick too many on the surface. You need each fluorescent molecule to give out as much light as possible to generate a signal you can detect.

00:46:52 - 00:48:01, Slide 13

I'd like to wrap up by saying that, in my opinion, exosome flow cytometry requires rigorous validation at several steps. You have to start by characterizing your instrument and making sure your background noise does not interfere with resolving the small silica particles that are, at this point in time, the best QC material we have. I know several companies are working hard on the next generation of QC material, a stabilized small vesicle in the correct size range, which will give us the ultimate quality control standard because it will have the same refractive index as our exosomes. For now, the silica microparticles are really the best means we have of assessing our instruments.

00:48:01 - 00:49:54

I've shown you that there are a variety of methods to isolate exosomes. I strongly encourage you to try several of them. It depends on what kind of starting material you're using. Plasma is much dirtier than, say, urine if you're going to go after a biofluid. Culture supernatant is pretty easy, and ultracentrifugation is pretty standard for that, but try lots of things; see what gives the best result in your hands. And I have to say this with the strongest emphasis: if you're thinking of starting to work with exosomes, don't think you're just going to try it once and get some pretty data. We've been working on this for two years. This is not a trivial [Laughter] undertaking. You have to be very rigorous and methodical in the way you do things to be able to prove to yourself, and then convince others, that you are truly seeing the correct particle. It's been a lot of fun, and I think there's no question that exosomes are the wave of the future for potential diagnostics, for figuring out intercellular communication, and for finding ways to interrupt it that might lead to new diagnostic or therapeutic approaches for a myriad of diseases. We are very [Laughter] excited about the potential. We work predominantly with pancreatic cancer patient samples here, and that's the disease we're going after.

00:49:54 - 00:51:04

I guess I'll wrap up by saying that our collaboration with the flow cytometry team at Thermo Fisher has really been wonderful. I think our work together has greatly improved our confidence that this next-generation Attune NxT cytometer has the ability to see microparticles and can reliably provide good surface antibody staining data. Probably the last thing I'll say is that the new side scatter filter we've come up with will be available in the near future. It takes a big company a while to come up with a new product, part number and everything, but it's coming, and you will have access to it in the near future: one small, simple filter to buy and swap out on your instrument. I think that's all I have. Thank you very much for your attention. I hope it's been beneficial. I'd be happy to entertain any questions.

00:51:04 - 00:51:23

Thank you, Steve, for your presentation. A quick reminder for our audience on how to submit questions: simply type them into the drop-down box located on the far left of your presentation window, labeled Ask a question, and click the send button. Steve McClellan will answer as many questions as time permits.

00:51:23 - 00:52:32, Question 1

First question is what are the challenges of using polystyrene beads and silica beads as size standards?

I hate to blatantly endorse one [Laughter] product over another, but Apogee really is the only company offering these small silica particles. There is a vast difference between the refractive index of silica and that of polystyrene. It's generally accepted in the field that polystyrene does not offer a close enough sizing standard to judge the size of an exosome. You must use the silica particles at this point; they really are the only thing available with a refractive index close enough to give a true sense of what size particle you're resolving in your flow cytometer. Coupling that with ultra-filtered sheath fluid really does make a huge difference in making that resolution happen.

00:52:36 - 00:53:46, Question 2

Thank you. Next one is did you consider instrument resolution limitations when designing your experiment? What consideration should be accounted for and considered?

I think you really have to start at the beginning. You have to start with the beads and prove to yourself that your instrument is capable of resolving them. I'm quite sure some of what I've told you today is applicable to other cytometers as well, but until you do it yourself and prove that you're actually resolving all of those little silica particles, there's no sense in even doing exosome [Laughter] preparations. You can't put the cart before the horse. You have to validate that your instrument can actually resolve things in that 30 to 150-nanometer range.

00:53:46 - 00:55:55, Question 3

Great. Thank you. Next one is what do you see as future directions in the exosome field and what role can flow cytometry play?

Wow. Last week, I attended the American Society of Exosomes and Microvesicles national meeting in Baltimore, and I was just astounded. It's been several years since I've attended that conference, and I was astounded at how many different disease states are now involved in exosome work: everything from Alzheimer's and other neurodegenerative diseases to, of course, cancer, diabetes, even restless leg syndrome. I was astounded to see a poster on restless leg syndrome; they think they can now predict which Parkinson's and Huntington's patients will be more heavily affected just by looking at an exosome signature. It was just astounding. I have no doubt, as I said earlier, that this is the wave of the future. There are so many potential uses for these things as early diagnostic markers and predictors for potential therapy decisions. The sky's the limit, and as I said at the very beginning, if you look in PubMed, there has literally been exponential growth in reports of people looking at exosomes. Cytometry is going to have a much bigger part to play in this field, especially if we can move towards cell sorters that are sensitive enough. It has been done; that's even harder. Oh my gosh, I can't even imagine, but if we can push to that point, we could do this surface labeling, sort on a specific surface marker, and see what's inside those exosomes. The sky's the limit. It's a very exciting field, and we're excited to be a part of it.

00:55:55 - 00:59:09, Question 5

Thank you. Next one is how are your samples collected and were there any time restrictions?

For working with culture supernatant, it’s not that difficult. You do have to start with much larger volumes to get the number of exosomes back. On average, we would fill all six places in the ultracentrifuge rotor with 50-milliliter tubes. We would try to spin at least 300 milliliters of culture supernatant to start with. That generates, in general, a pretty good amount of exosomes to work with. The interesting thing we’ve noticed along the way is that there’s huge variability in the number of exosomes that different cell lines secrete. Some don’t secrete very much at all and others just make hordes of them, so that’s something to consider as well. You might want to try several different cell lines to see what gives you the best yield to have material to work with.

I think I’ve mentioned in the beginning, we use patient blood and that’s collected in a green top heparin tube as plasma. It is critical that you do not use serum because the process of letting blood clot traps so many of the little exosomes in the actual hard clot, so that when you just pour off the serum, there are very few exosomes left in it. You really have to use plasma. Many of my colleagues use urine, especially looking at prostate cancer. That’s a very nice way to isolate exosomes because urine is actually pretty clean and free of other things. Again, you can use cerebrospinal fluid. There are lots of different things you can do.

One thing that we did test early on and feel is very critical for obtaining blood from patients is that we want to stabilize that plasma with a protease inhibitor as fast as possible. We strive to have that tube of blood in our hands within 30 minutes of it coming out of a patient, spin it down, and immediately add a standard protease inhibitor tablet at the recommended concentration. I’m not even sure where we get ours from, but that’s a pretty standard thing. We’ve done this side by side. We’ve done time tests where the blood sat for 30 minutes, an hour, or four hours with or without protease inhibitors, and we do see a strong loss of surface markers on the exosomes if we don’t have those protease inhibitors in there. I probably should’ve mentioned that in the main part of the webinar. It’s very important to have those there to help preserve the structure of the exosomes.

00:59:09 - 01:01:40, Question 6

Great. Thank you for that. The next one is I’m really confused by terminology, EVs versus MVs, versus exosome versus apoptotic bodies etcetera. What is the standard terminology of small particles?

It’s crazy. I really think it’s best to go back and look at that slide where I’ve got the three successive spin speeds laid out, because that’s the best way to wrap your head around it. At each higher level of g-force, you’re going to pull down smaller and smaller particles. The largest of these are considered apoptotic bodies. These are blebs that are spit out of cells as they’re undergoing the final processes of apoptosis, where the cell is literally ready to break down. I mentioned large oncosomes, which are somewhat controversial in the field, actually, but those are lumped into that large category. The medium category is always going to be a catchall. Microvesicles can be anywhere from about 150 up to 1,000 nanometers. That’s a pretty big size range. We do believe that the way they’re generated is pretty uniform, from the budding of the exterior of the membrane, and then an exosome is never bigger than 150. Some people are a little more strict and call it 100 nanometers at the top end, so 30 to 100 nanometers.
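As a rough sketch (not code from the talk), the small/medium/large vocabulary described above can be written down as a simple size classifier. The cutoffs are the approximate ranges just described; exact boundaries vary between labs:

```python
def classify_particle(diameter_nm: float) -> str:
    """Map a particle diameter (nm) to the small/medium/large vocabulary.

    Cutoffs follow the approximate ranges described above; some labs
    cap exosomes at 100 nm instead of 150 nm.
    """
    if diameter_nm <= 150:
        return "exosome (small)"        # ~30-150 nm
    if diameter_nm <= 1000:
        return "microvesicle (medium)"  # ~150-1,000 nm
    return "apoptotic body (large)"     # blebs from dying cells

print(classify_particle(80))    # exosome (small)
print(classify_particle(500))   # microvesicle (medium)
```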

We’re pretty confident in how those are developed as well: the invagination, and then the multivesicular endosome body that fuses to the membrane and releases those little particles out to the surface. There have been groups that have done just phenomenal live cell imaging. You can actually pre-label the membrane with some of these dyes I’ve talked about, lipophilic dyes, and then literally, over time-lapse videography, watch exosomes being released from cells in live time. It’s a phenomenal thing. We know how it works. We can watch it. Really, you just think about small, medium, and large. That’s the best way to think of it.

01:01:40 - 01:03:20, Question 7

Well, thank you. Looks like we have time for one more question. How did you determine the optimal type of antibodies to use in this application?

Really no different than titration of antibodies on cells. It’s very important to try a variety of different concentrations. What you’re looking for is the signal-to-noise ratio; if you have antibodies at too high of a concentration, the negative cells will creep up and be brighter. Optimally, what you’re trying to do - and the point of titration is really this - is to maximize the space between the negative and the positive population. When you do your titration of different concentrations, that’s what you look for. Then at a certain point, of course, you’re going to start to drop off your signal, so you know that’s gone too far dilute. There are many, many references out there that talk about how to properly do titration of antibodies; just search the web and I’m sure you can find four or five. It’s very important to titrate all of these epitopes, especially in our case where we were trying to minimize the amount of antibody needed to get a signal in this no-wash assay. We found that to be very, very critical. You can’t just go add a whole bunch of antibody in there and not wash it back out.

01:03:20 - 01:03:56

Well, thank you. I would like to once again thank Steve McClellan for his presentation and I’d like to thank the audience for joining us today and for their interesting questions. Questions we did not have time for today will be addressed by the speaker via the contact information you provided at the time of registration.

We would like to thank our sponsor, Thermo Fisher Scientific, for underwriting today’s educational webcast. This webcast can be viewed on demand through May 2019. LabRoots will alert you via email when it’s available for replay. We encourage you to share that email with your colleagues who may have missed today’s live event. Until next time. Good bye.

Panel Design for Multi-Parameter Flow Cytometry

Join Carol Oxford, FAS, in this presentation on panel design. The correct controls make or break your multicolor panel. Learn the value of including the correct controls and reagent titration. Understand the importance of fluorescence spillover and finding your fluorescence spectral balance.

00:00:00 – 00:01:18, Slide 1

Moderator: Hello everyone and welcome to today’s broadcast, Experimental Design Best Practice for Multicolor Flow Cytometry, presented by Carol Oxford, field application scientist, Thermo Fisher Scientific. I’m Xavier and I will be your moderator for today’s event. Today’s educational web seminar is brought to you by LabRoots and sponsored by Thermo Fisher Scientific. For more information on the sponsor, please visit www.thermofisher.com. Now before we begin, I would like to remind everyone that this event is interactive. We encourage you to participate by submitting as many questions as you want, any time you want during the presentation. To do so, simply type into the “Ask a question” box and click on the send button. We’ll answer as many questions as we have time for at the end of the presentation. Also, please note that you will be viewing the presentation in the slide window. To enlarge the window, click on the arrows at the top right hand corner of the slide window. If you have trouble seeing or hearing the presentation, click on the support tab at the top right of the presentation window, or report your problem by clicking on the “Ask a question” box located on the far left of your screen.

00:01:18 - 00:01:55

This presentation is educational and thus offers continuing education credits. Please click on the continuing education credits tab located at the top right of the presentation window and follow the process to obtain your credits. I’d like to now introduce our presenter Ms. Carol Oxford. Carol Oxford is currently a field application scientist with Thermo Fisher Scientific. She has worked with FlowJo LLC and was one of the founding partners of ExCyte, a flow cytometry education company. Ms. Oxford, you may begin your presentation.

00:01:55 - 00:03:03

Thank you so much, Xavier. I really appreciate the opportunity to come talk to you about flow cytometry. I’ve done this for quite some time. I was also the UC Davis Flow Cytometry Core director for about 25 years. I think flow cytometry is one of the most elegant technologies in science. When I started flow cytometry, multiparameter meant two fluorescent parameters, and now the people at the cutting edge of flow cytometry are doing 32 fluorescent parameters. It’s changed quite a bit during that time. I’m really thankful to all the mentors that I had, and so now I have a lot of experience, and hopefully you’ll find some information here that’s helpful to you as you design your multiparameter panels. I put my email address here on the slides so that you can contact me after the seminar if you want any of the information that I give here. You can also email me about other topics in flow cytometry. I can give you some resources as well in the seminar.

00:03:03 - 00:03:54, Slide 2

Let’s just talk about flow cytometry in general. There are advantages and disadvantages of course to every technology that we use in science. There are a lot of advantages to flow cytometry. One of the things that I like the most about this technology is that we get a lot of information on a per cell basis. We use a population of cells that have a lot of heterogeneity, but we’re doing this on a per cell basis. It’s extremely quantitative. We look for statistics like population frequency and expression level on a large number of cells. We can look at tens of thousands or even millions of cells, in an extremely short period of time. We can generate this data very quickly.

00:03:54 - 00:05:57, Slide 3

Of course there are some limitations to flow cytometry. Despite our best efforts, the cells must be in suspension, and those of us who have run a core facility or run our own samples realize that putting clumps of cells or pieces of tissue in a flow cytometer quickly ends that experiment. We spend a lot of time fighting with the flow cytometer. We have to get the cells in suspension. We have to make sure that they’re filtered, or we use enzymes to make sure that the cells pass through the flow cytometer. The other limitation is that we get no spatial information unless we use a complementary technology like microscopy or an ImageStream, which gives us spatial information about where the cells are. To flow cytometrists, cells are an electronic pulse. We don’t get any information about where the fluorescence is inside the cell.

00:04:50 - 00:05:57, Slide 4

One thing that’s interesting about flow cytometry data is that it’s very easy to misunderstand. I get this all the time from clients and students: because I’ve had 30 years of experience, they want me to look at their data or they want me to analyze their data. What I want to present today is that you don’t have to have 30 years of experience to get good flow cytometry data. This data is not subjective. It’s very important to have adequate controls, to understand a little about physics and a little about antibodies, and to do a very careful analysis. People that are just starting out in flow can do the analysis the same way as somebody with 30 years of experience. Sometimes, in fact most of the time, you’ll have more controls than you will have samples. It’s just really important to follow some important steps to get really good flow data, and we’ll talk about most of those here.

00:05:57 - 00:06:32, Slide 5

Before you panic: this slide is extremely busy, and it shows a tremendous number of applications in flow. I know there were a lot of times in my career that I thought, “We’re pretty well done. We have a number of applications, but we’re kind of stuck.” Then fluorescent proteins came to the table. Certain dyes came to the table and probes came to the table. All of a sudden we were off and running again.

00:06:32 – 00:06:42, Slide 6

I’m certainly not going to go over any of these applications in particular, but I wanted to say that if we can develop a fluorescent probe or a fluorescent marker, the list of applications of flow cytometry is almost endless.

00:06:42 – 00:06:59, Slide 7

Let’s talk about flow cytometry best practice. These are the things we’re going to talk about today. They don’t encompass all of flow cytometry best practice, but they’re certainly important to panel design.

00:06:59 - 00:07:46

The first thing we’re going to talk about is how do we know the machine is performing optimally? Certainly, the manufacturer of flow cytometry instrumentation is going to develop a protocol, to give you an idea of whether the machine is performing optimally. We’re also going to talk about what you might want to do above and beyond this level of testing to make sure you’re going to get good data. We’re also going to talk about how to choose fluorophores. It’s very important to understand some of the characteristics of fluorophores that might make you choose one over the other and give you an advantage in panel design. We’re certainly going to talk about titration. This is one of the most important and most simple concepts that will give you good data and make panel design much easier.

00:07:46 - 00:08:32

Then of course, getting along is important in the workplace. It’s important in relationships and it’s incredibly important in panel design. Not all fluorochromes get along, and we’ll talk about which ones do, which ones don’t, and how you can get them into a panel together. Then of course, control issues. Me, well, I didn’t think I had control issues, but when you become an expert in multicolor panel design, you’re going to have to have control issues. You’re going to run a lot of controls and you’re going to want to know what each one is for, and you’re going to have a lot of them. We talked about that already, and you’re probably going to be sick of how much I talk about them in this talk, so bear with me. You’ll still love flow cytometry by the end of the talk, I do, but we do have a lot of controls.

00:08:32 - 00:10:30, Slide 8

One of the most important things to understand, of course, when you’re using any machine is: what does it do? Now, I’m not going to get into the heart of physics because, well, I shouldn’t say I don’t like physics, but I’m not a physicist. I understand the least I need to know about physics to understand how the machine works. In the upper left corner of the slide there, you’ll see what happens when you open the machine - the Attune NxT Flow Cytometer, which I represent as a field application scientist. Here you can see the filters of the machine and you can see the lasers and some of the optics. It looks pretty complicated, but it all boils down to the detector. What is the cytometer doing? It’s counting photons. We measure the photons that are emitted when the laser excites the fluorochrome and the fluorochrome emits photons. The photons hit the photocathode at the front of the PMT. The electrons that are knocked off that cathode go through a dynode cascade and are amplified. They then are emitted off the other side of the PMT and we can measure them as an electronic pulse - very, very straightforward. All you need to know is that. We quantify the photons, we measure the electronic pulse, and there you go. If you know a little bit about physics and a little bit about photons, you’re now a flow cytometry expert.

00:10:30 – 00:13:33, Slide 9

When we started in flow cytometry, we used analog instruments. We didn’t know a lot about PMTs, or at least I didn’t. We had a four-decade scale to graph our data. We could walk up to an instrument, take an unstained sample, and put it in that magic place - which we know is no longer magic - the first log decade on our scale. We knew that most of our fluorochromes, most of our samples, would fit in that space, and maybe sometimes we had to turn the voltage down, but otherwise we got pretty good data. Then we started using analog-to-digital hybrid machines and then fully digital instruments. We saw all sorts of things happening. We saw that if we cranked the voltage up, there were populations appearing that we didn’t see before. We saw our background increase sometimes; we saw all sorts of things. We realized that we needed to understand a lot more about PMTs. We needed to set voltages that were appropriate for our experiments. We needed to do some optimization that’s now been called either a voltage walk or a voltration. Voltration sounds really cool. If you print it on a t-shirt it makes you very popular at parties. I’m just kidding. I also find my jokes really humorous. What does a voltration do? It assures the best signal-to-noise detection, which is what we want. For our data, it’s important to know the difference between background, negative, and positive. We also want to assure all the measurements are being made in the linear range of the detector. Before, we only had four decades, and now we have instruments with five, six, and seven decades of range. We need to know how many decades of real data we get from the electronics. If we decrease the voltage below the minimum voltage that we find in a voltration, the resolution of dim populations may be lost. Increasing the voltage for a given PMT above this voltage gives no advantage in population resolution.
I also get a question all the time from customers: why not just put a sample on and increase the voltage until the positive is almost off scale, which is really what we first did? If a low voltage is a problem because we’re detecting the noise of the instrument, why not just crank the voltage up? Then we realized that that instrument noise can also become a problem and we get spillover spreading from the machine. I’ll explain this as well.

00:13:33 – 00:13:44, Slide 10

This is one way to do a voltage optimization. We use here our UltraComp beads and I’ll talk about those in a minute.

00:13:44 - 00:14:17, Slide 11

Basically, they’re antibody capture beads and they contain a binding bead and a non-binding bead. The binding beads bind to the variable region of your antibody, so you can put your antibody in the tube, put the beads in the tube, and you get two populations. Here you can see that at a PMT voltage of one volt, which is basically off, we haven’t applied enough voltage to the PMT to get a signal, and there’s basically only one population.

00:14:17 – 00:15:00, Slide 12-18

Both of those beads are not being excited enough to see, and we start walking the voltage up in steps of about 25 volts - very straightforward - until two populations appear and then the beads start to go off scale. The detector here, if you can’t see the slides very well, is the BL1 detector on the Attune. That is excited by the blue laser and the emission filter there is 530/30, so we’re collecting green emission off a blue laser.

00:15:00 – 00:15:06, Slide 19

If we do what’s called a concatenation in FlowJo, you can look here and see some interesting things about the data.

00:15:06 - 00:15:40, Slide 21

You can see that down at the low end, at 250 and 300 volts, the data is not linear. You can see the data starts to become linear when we hit 350 volts. We know we want to be above that range. You can also see that maybe somewhere around 375 volts, that background starts to spread. That’s going to become a huge issue when we start using multicolor panels.

00:15:40 - 00:15:50

Here, how would we decide what voltage to use? I can pick a magic red arrow there and I can tell you that’s what voltage I’ve picked and I’ll tell you why in a minute.

00:15:50 – 00:16:15, Slide 22

Here’s the voltage optimization that’s been done on an Attune NxT; I’ve walked five PMTs off our blue laser, three off the red, and three off the violet. Now you can do this on your machine as well - you walk each PMT individually - and this is what you get when you visualize the concatenation in FlowJo.

00:16:15 – 00:18:06, Slide 23-25

What in the world do you do with all that data? There are a number of things you can do. We submitted an abstract at CYTO on what we did, and I can certainly send that to you if you want. It will also be published in our BioProbes, and I can send you the link to that as well on our website. Here is the data, looking again at that BL1 PMT and also the YL4 PMT. The YL4 PMT is yellow excitation - sorry, I don’t know the filter there - that’s the detector you would use for PE/Cy7, so long red emission off the yellow laser. We looked at several metrics here: the stain index, which I’ll explain in a minute - I’ll give you the formula for that, and you can Google Marty Bigos and stain index for the publication on it. We also looked at the separation index, we looked at the robust CV of the positive signal, and we looked at two times the standard deviation of the negative signal. There are a lot of metrics you can look at here. Basically, you can look at where the robust CV starts to decline and flatten out. Then you can also look at where the peak of the stain index is and where that starts to flatten out. Here, for the YL4 PMT for example, that starts to happen at about 400. These are some various methods and metrics you can look at for the voltage optimization. If you look at these graphs, you can certainly argue that there are several voltages that might work, and that’s a valid argument.

00:18:06 - 00:18:18

The next step in this process is to take your data, run them at voltages around that minimum voltage - say 375, 400, and 425 - and see how this affects your data. Look at some data with low-expressing markers and see if it makes a difference. The real point in this optimization is to find a minimum baseline voltage. You may adjust those voltages up or down, but probably not. This is a good place to start every experiment.

00:18:18 - 00:20:18, Slide 26

Here’s another option for your voltage optimization. Here you have a hard-dyed bead. The bead that I use for this optimization is peak 2 of the Spherotech 8-peak bead set. This is a hard-dyed bead with a single peak, so I don’t have two peaks here that I have to set a gate on, and I used some very simple metrics. I used the robust CV and the robust standard deviation, I looked at those graphs simply in Excel, and I looked at the range. I didn’t look at the inflection point - that’s also an interesting point - but I looked basically at where the robust CV leveled off and where the robust standard deviation started to rise. For example, for this PMT, I would probably pick a voltage of about 350 for my minimum baseline voltage. The reason I would pick that voltage is that I get a good flat CV, so I know increasing the voltage there would not give me an advantage, and I have a very low contribution of the electronic noise as measured by the standard deviation. I’ve done this on multiple occasions with multiple machines. I’ve also run data at these settings, and it suggests that I get a very good signal and good data here.
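As an illustration only (not code from the webinar), the two metrics just described can be computed from a bead signal as below. The function `pick_baseline_voltage` and its 5% tolerance are hypothetical choices for this sketch; the 0.7413 × IQR estimate of robust SD is a common convention in cytometry software.

```python
import numpy as np

def robust_stats(signal):
    """Robust SD (0.7413 * IQR) and robust CV (%) of a bead signal."""
    q1, med, q3 = np.percentile(signal, [25, 50, 75])
    r_sd = 0.7413 * (q3 - q1)
    return r_sd, 100.0 * r_sd / med

def pick_baseline_voltage(walk, tolerance=1.05):
    """walk maps PMT voltage -> array of bead intensities.

    Returns the lowest voltage whose robust CV is within `tolerance`
    of the best (flattest) CV seen - a sketch of the 'flat CV'
    criterion described above.
    """
    cvs = {v: robust_stats(sig)[1] for v, sig in walk.items()}
    best = min(cvs.values())
    for v in sorted(cvs):
        if cvs[v] <= best * tolerance:
            return v

# Toy walk: the CV flattens out at 350 V, so 350 is the baseline.
walk = {
    250: np.array([50.0, 100.0, 150.0]),
    300: np.array([90.0, 100.0, 110.0]),
    350: np.array([95.0, 100.0, 105.0]),
    400: np.array([95.0, 100.0, 105.0]),
}
print(pick_baseline_voltage(walk))  # 350
```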

00:20:18 - 00:21:26

The next step that I do in this optimization is to take another hard-dyed bead, for example the peak 5 bead, and walk my instrument for all PMTs doing this optimization. When I find those voltages, I make a histogram with all of my PMTs. Then I set the voltages from the optimization, run my slightly brighter bead - my peak 5 bead - at all of these voltages, and set a gate. I run this every day and I use the voltages every day for my experiments. That way, if my instrument changes, if anything changes in my experiment, I can say to the FDA or to the reviewer of a paper that I have measured the same fluorescence every day. This is a great way to not only make sure that your instrument is characterized, but also to make sure that you are characterizing your experiment as well.

00:21:26 – 00:21:44, Slide 27-29

Let’s review the voltage optimization. We want to optimize our PMT voltages so we have a minimum voltage at which we are always detecting our cells and our signal above background.

00:21:44 – 00:22:17, Slide 30

If we do need to go below this minimum - and why would that ever happen? If I put on my cells and I realize that my positive signal is now off scale - if I haven’t titrated, or maybe I’m using GFP and I just can’t get that signal on scale - the most important thing is to get it on scale, so I’m going to lower my voltage. If I have to do that, I know that I may be sacrificing signal to noise, but at least I have an understanding of that.

00:22:17 – 00:22:22, Slide 31

With GFP if I’m only looking at something bright, signal to noise may not be my most important consideration.

00:22:22 – 00:22:24, Slide 32

If I raise the voltage, there’s no advantage to resolution.

00:22:24 – 00:22:32, Slide 33

I don’t want to change these voltages arbitrarily. Optimization is very important, so just get it done.

00:22:32 - 00:22:41, Slide 34

If I do this on an Attune NxT with the hard-dyed beads, it takes me about an hour. I throw the numbers into Excel and it’s very easy.

00:22:41 – 00:24:36, Slide 35

Now we have our flow cytometer ready; what’s next? There are many ways to do panel design. There’s an excellent paper by Yolanda Mahnke and Mario Roederer. I believe it’s from 2010, but if you search panel design and those two names, M-A-H-N-K-E and R-O-E-D-E-R-E-R, you can find the paper, and it talks about ranking your markers. You want to choose the markers which best answer your research question. They rank them primary, secondary, and tertiary. I find that very helpful. It’s not engraved in stone, but it’s very helpful, because again I use this little rhyme: “The more colors you use, the more sensitivity you lose.” It’s counterintuitive, but the more fluorochromes you add to a panel, the more you encounter a concept called background expansion because of spillover spreading. This is a problem of physics; it’s not something that compensation corrects for, not something we can avoid. If you can answer the question with fewer markers, it’s always an advantage to do so, but of course nobody wants to do that, right? We want to use more markers, and we’ll talk about how to do that. You’re going to want to titrate all your reagents and screen for the best ones. What are the best ones? Of course, they’re the ones that separate the populations best, but also, sometimes the same reagent conjugated to a different fluorochrome is going to give you a different staining pattern. You want to test all the combinations that you can, you want to troubleshoot issues with sensitivity - I’ll give you some tips there - and then you generate your data.

00:24:36 – 00:26:08, Slide 36

Of course, this goes back to high school science, or maybe even elementary school science. We determine a biological hypothesis. Which antibodies do we need for our markers of interest? You’re going to do a literature search, and there’s a really helpful resource for this; they’re called OMIPs. I’ll talk about those in a minute and show you where you can find them. The most important thing is that panel design is iterative. Predictive panels are exactly that: predictive. No matter how much expertise I have, I can give you a predictive panel, but I can’t tell you with any confidence whether it’s going to work. I used to teach week-long courses in flow cytometry, and in every single class there’d be people disappointed at the end that I couldn’t hand them a panel that worked. I certainly can hand you panels that I’ve done. I can give you panels in the literature that other people have done, and they have a good chance of working, but you are not going to do the same thing on the same machine with the same cells - unless you want to repeat their work. You’re certainly going to have to do this in an iterative manner. The idea is that you’re going to build a backbone panel that always works, and then you’re going to add antibodies and add markers and see if that backbone still works, until you can add as many markers as you need.

00:26:08 - 00:27:48, Slide 37

There is a lot of help out there. The flow cytometry community is really, really collaborative, and OMIPs were designed, or actually introduced, by the reviewers of the journal Cytometry. I think there are more than 50 now. These are open source, and they were designed because no matter where you publish a paper, you can never get all the information that you need from the journal’s methods section. Let’s say I want to publish a paper in Nature: “Carol’s T-cells Kill Cancer.” Unfortunately, I can never get all of the important details out in the materials and methods section. But I can publish another paper, a two-page paper called an OMIP, an Optimized Multicolor Immunophenotyping Panel, in Cytometry. This gives all of the detail that flow cytometry nerds want to see, right? You can see important things like the gating strategy, the clones, the panel design. Did I have any difficulty with certain antibody combinations? Here you can see in the gating strategy, I don’t use a forward and side scatter gate to determine any of my cells of interest; I just use it as a debris elimination gate. I gate first on my doublets. I gate second on my dump channel and viability dye. I use different types of plots - maybe a plot that shows some back gating that I did. It’s really, really critical; you can use these as references.

00:27:48 – 00:28:40, Slide 38

We also keep track of these on our website. You can look at our flow cytometry resource library which has a tremendous number of resources for both people that are new to flow cytometry and people that are very experienced. We have flow cytometry application notes, techniques. We have videos so that if you are new to flow cytometry, you can figure out how a flow cytometer works. We also keep track of all the OMIPs here because if you Google them, sometimes only a few of them come up. I really appreciate them because when I studied T-cells, they were just activated or producing cytokines. Now, people ask me about panels for T-cell exhaustion. They weren’t exhausted when I was studying them. You guys have the hard work to do, right? The old people in the field, we did the easy work.

00:28:40 – 00:28:54, Slide 39

These will really help you with your research. I talked about the stain index. This is the formula for stain index: basically, the median of the positive minus the median of the negative, over two times the standard deviation of the negative.

00:28:54 – 00:29:00, Slide 40

It’s a metric that allows us to look at the separation of our population. There are many of them out there, but this is one that we would commonly use.
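To make the formula concrete, here is a minimal sketch (mine, not the speaker's) of the stain index computation, using the definition just given:

```python
import numpy as np

def stain_index(pos, neg):
    """(median of positive - median of negative) / (2 * SD of negative)."""
    return (np.median(pos) - np.median(neg)) / (2.0 * np.std(neg))

# Toy data: positives around 200, negatives around 100.
pos = np.array([195.0, 200.0, 205.0])
neg = np.array([98.0, 100.0, 102.0])
print(stain_index(pos, neg))  # ~30.6
```

A higher stain index means better separation between the positive and negative populations, which is what makes it useful for comparing voltages or reagents.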

00:29:00 – 00:30:05, Slide 41

Antibody titration: people always ask me what’s the best way to titrate an antibody. Any way you want to, right? There are a tremendous number of protocols out there for this. One that I like is by Ryan Duggan; if you Google Ryan Duggan, titration, and Alexa Fluor 488, he has a great protocol that actually tells you the amount of PBS to put in a well when you’re titrating your antibodies. This is a titration that was done by one of our scientists. It looks at 1:2 serial dilutions of antibody and then graphs the stain index. This is where there’s a lot of mythology about antibody concentration. Here’s the graph of the stain index that you can see. When I was new to flow,

00:30:05 - 00:30:09, Slide 42

I would have said, "I would look at this and pick one of the higher concentrations of antibody where that stain index flattens out, because I want to be at saturation," which might be the concentration here.

00:30:09 - 00:30:53, Slide 43

Now, though, I've learned that you can use a concentration here. There's a difference between a saturating titer and a staining titer. What you're trying to do here, or maybe what you're trying to do, is make sure that in the population you're trying to detect, you're detecting the percent positive. If you're trying to see a population shift or an MFI shift, you're going to want to use the saturating titer; that's a different case. Most of us are just trying to say I have 20% of these or 50% of those.

00:30:53 – 00:30:58, Slide 44

In that case, you may want to use a separating concentration.

00:30:58 - 00:31:56, Slide 45

Beyond the saturating concentration, if you increase the antibody, you only increase the non-specific background. People always ask me, "Why can't you just wash more?" Washing does not eliminate non-specific background, at least not very well. Washing is a great way to lose cells; you usually lose about half of them in every wash. It does eliminate excess fluorochrome, but that's really not the important thing when you're titrating and staining. A separating concentration gives you good separation of your cells, reduces your spreading error and saves you antibody. You can often use the separating concentration when you're only looking at the percent positive, which is really what the vast majority of us do when we're doing immunophenotyping.

00:31:56 – 00:34:44, Slide 46-47

Let's talk about spillover, compensation and spread. These terms are thrown around a lot in flow cytometry; they're sort of the language that flow-ers speak. You'll also hear the term spillover spreading. Bear with me a minute as I talk about something that's not on the slide here. When I talk about compensation I often use FITC, because it's a fluorochrome that we're all really familiar with. When I ask what color FITC is, you say green. I just set you up for failure, because that's a trick question. FITC is not green. We think FITC is green because we've been scammed. We think it's green because we look at it under a microscope, and we have a filter on that microscope that only allows those really energetic green photons back to our eye. I feel like making a Netflix-type documentary, "The Truth About FITC." Here you'll see the emission spectrum of PE-Cy5.5 on your screen; I'm sorry, maybe it's PerCP-Cy5.5, yes it is. An emission spectrum is a probability curve. Going back to FITC, if I hit FITC with a blue laser, the probability of FITC emitting green photons is very, very high. FITC also has a very good probability of emitting yellow photons or orange photons or even red photons. I have a flow cytometer that detects yellow, orange and red photons very well. It's very easy for me to determine where those photons are coming from if I have a single stain for every color that I use in my experiment. We then use compensation, which is an algorithm, to separate the emission of these fluorochromes and visualize them in multi-dimensional space. That sounds really tricky, and the math might be too complicated for me to do in my head, but it's very, very easy in practice.

00:34:44 - 00:34:54, Slide 48

Here are two fluorochromes that we used, PerCP-Cy5.5 and APC Alexa 700, and here's an example of where the photons emitted by one are detected in a neighboring PMT.

00:34:54 – 00:35:53, Slide 49

When I run a single-stained PerCP-Cy5.5 sample, all of the cells, or all of the beads, whichever this is, are detected in the primary detector where I want to make a measurement. They're also detected in the secondary detector where I'm going to eventually make another measurement, so this cell gets a value here and it also gets a value in this detector. Compensation just uses an algorithm to force the median of the value on the Y axis to be equivalent to the median on this axis. We could have chosen another algorithm to do this, but we chose compensation. Compensation allows us to visualize the effects of spillover in multi-dimensional space.
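In matrix form, the calculation described above can be sketched in a few lines. The spillover fractions and the event values below are invented for illustration; real compensation software estimates the matrix from your single-stained controls:

```python
import numpy as np

# Spillover matrix estimated from single-stained controls: row i gives the
# fraction of fluorochrome i's signal seen in each detector. The 35% and 5%
# figures are hypothetical.
spillover = np.array([
    [1.00, 0.35],   # PerCP-Cy5.5 spills 35% of its signal into detector 2
    [0.05, 1.00],   # a neighboring dye spills 5% into detector 1
])

# One PerCP-Cy5.5-stained event as measured: bright in its primary
# detector, plus the spillover signal in the secondary detector.
raw = np.array([[1277.0, 447.0]])

# Compensation solves raw = true @ spillover for the true signals
compensated = raw @ np.linalg.inv(spillover)
```

After compensation, the event keeps its primary-detector value and the secondary-detector value collapses to roughly zero, which is exactly the "force the medians to be equivalent" behavior described above.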

00:35:53 – 00:36:01, Slide 50-51

Sounds complicated? It’s not. It just uses a vector and the slope of this line to do the calculation.

00:36:01 – 00:36:40, Slide 52-56

Here we have a median of 1,277 and 60.2; I'm sorry, we have an MFI here on this axis. We use compensation with our single-stained sample to adjust these medians to be more equivalent, and we have a fully compensated sample.
Now, when this cell gets a value here and this cell gets a value here, I can't change those values. I can only bring them down to the lower end of the log scale, and now I visualize this spread.

00:36:40 - 00:37:01

It's not visible here, sorry, because it's on the high end of log space. It's much more visible here. This is what is called spread, or spillover spread. I don't really like those terms, but we're kind of stuck with them. That's what you see here.

00:37:01 – 00:37:5, Slide 60

This is an important example looking at uncompensated and compensated data. Usually, we can recognize that the plot on the left is uncompensated by the strange diagonal, and on the right, we only know the data is compensated because it's marked compensated. We really don't have any way to statistically determine that those medians are equivalent without the compensation control. It's important to know that we've used the right compensation controls. On the left, we don't know whether the population that is dimly stained for CD11b and brightly stained for CD11c is indeed double positive. We want to make sure we use the right compensation controls so that we have good data.

00:37:5 – 00:39:22, Slide 63

The process of compensation is very easy even though the math is kind of complex. Compensation beads are really the easiest way to do compensation. They work for all lasers and, depending on the manufacturer, they work for every antibody. Of the beads that we carry, the eBioscience UltraComp eBeads work for mouse, rat or hamster antibodies, and the AbC beads also work for rabbit antibodies. The nice features of the beads are that the unstained bead fluorescence is very similar to most cells, and they give you a bright stain and a clear separation. The rules of compensation, of course, are that you have to have a single stain for each fluorochrome in your panel. You need to have a negative and a positive, either in the tube or in the algorithm, so you can use an unstained. It used to be important that your compensation samples were as bright as or brighter than your experimental samples, but when I talked to experts in the field like Dave Parks, who writes the compensation algorithms in the software, he said the algorithms should calculate that math for you. I think it's still probably important to have a very bright and well-separated population, and the beads do this for you.

00:39:22 - 00:39:40

It’s helpful certainly if the target is not highly expressed on your cells or expressed by only a few cells. For example, if you’re using a cytokine or FOXP3 or something that’s not well expressed, these compensation beads work really, really well.

00:39:40 – 00:39:57, Slide 76

Now that we've got our compensation controls done and our instrument voltages are set, we want to talk about some basic strategy for fluorochrome selection. We want to save the brightest fluorochromes for our dimmest markers.

00:39:57 - 00:40:59

How do we know if our markers are dim or bright, or if our fluorochromes are dim or bright? I'll talk about that on the next slide. We want to save the brightest fluorochromes, APC, PE, the Alexa dyes, the Super Brights, for our important targets, which we've ranked with the paper that I discussed, and for the markers that resolve worst: those with low expression, poor antigen access or an unknown expression level. If I want to look at antigen X, I may not know what its expression level is. Then, I want to minimize the spillover. I think about this as part A and part B of panel design: part A is a stain index chart and part B is a spillover spreading chart. I want to look at stain index and spillover spreading. Let's go to the stain index slide.
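The "brightest fluorochrome for the dimmest marker" rule is essentially a sort-and-pair step, which can be sketched like this. Every name and number below is a hypothetical placeholder, not a recommendation:

```python
# Relative expression levels (low = dim marker) -- hypothetical values
markers = {"antigen X": 1, "FOXP3": 2, "CD25": 4, "CD4": 8, "CD45": 10}
# Stain indices measured on your own instrument -- hypothetical values
fluors = {"PE": 310, "APC": 250, "Super Bright 436": 180,
          "FITC": 120, "PerCP-Cy5.5": 60}

dim_first = sorted(markers, key=markers.get)            # dimmest marker first
bright_first = sorted(fluors, key=fluors.get, reverse=True)  # brightest dye first
panel = dict(zip(dim_first, bright_first))
print(panel)   # dimmest marker is paired with the brightest dye
```

This only covers "part A" (brightness); you would still check the resulting pairs against a spillover spreading chart before committing to the panel.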

00:40:59 – 00:42:34, Slide 61

Here's an example of the ranking of fluorochromes by stain index. This is one that we did at Thermo Fisher. We took freshly isolated PBMCs and stained them with anti-human CD4 conjugated to various fluorochromes. This of course will vary manufacturer to manufacturer and instrument to instrument. What we've listed here is the fluorochrome, the laser, the filter and the stain index that we got with the Attune NxT Flow Cytometer. You can certainly use this on your instrument; these things will be relative instrument to instrument. It is best to generate it on your instrument, and certainly different manufacturers will have this for their instruments and their fluorochromes. I like to have this printout in front of me so I can see the brightness of the fluorophores. Brightness depends on several things. Fluorophores are bright because they're quantum efficient. For example, FITC is very quantum efficient: when you hit FITC with a lot of light, you get a lot of photons. But FITC can unfortunately be a problem if you're looking at an autofluorescent cell line, because autofluorescent proteins emit in the green. This chart only accounts for the stain index and not for autofluorescence. That's another thing we want to consider, but this is a great chart to have as you're doing panel design.

00:42:34 – 00:43:40, Slide 77

Here we'll start out talking about spillover spreading. We talked a little bit about fluorochromes whose emission spectra overlap. On the left, you'll see an example of CD3 APC-Alexa Fluor 780 and CD56 APC. You can see that those emission spectra overlap a bit, and if you use these on cells that co-express the two markers, it would be very difficult to quantitate those populations from the pattern that you get, even if you use an FMO. If we move those co-expressed markers onto fluorochromes like eFluor 450 and APC, we see a much better staining pattern that gives us a much better idea of how those markers are co-expressed.

00:43:40 – 00:43:50, Slide 78

This is a classic example of two fluorochromes that are used quite often in panel design.

00:43:50 – 00:44:41, Slide 79

Here's an example of PE and PerCP-Cy5.5. PE is a very, very bright dye, and often when manufacturers come out with a new antibody, the first thing they do is conjugate it to PE. I'll tell you that PE is possibly the worst place to make a sensitive measurement on your cytometer. Now, PE is very bright and we use it all the time, but PE is also the place that accepts a lot of error from the other fluorochromes that you use on your cytometer, and I'll show you why. Here we have a PE single stain, and we're measuring it in PE and in PerCP-Cy5.5.

00:44:41 – 00:45:01, Slide 80

Again, as each cell goes through the cytometer, it gets a value in every detector. When we compensate that data, you can see the spread of this data. Compensation lets us visualize the spillover and the spread.

00:45:01 – 00:45:17, Slide 81

If we use a gate set on the unstained population, it obviously looks like there's a population of double-stained cells on the right. To know whether those are really double stained, we would have to use an FMO, but these cells are definitely there.

00:45:17 - 00:45:52, Slide 82-83

There's an easy way to do this in multicolor space; this is a neat trick. Instead of thinking about how bright our stain is, which is part one of panel design, the stain index, now we're going to think about spread. Say that for that Nobel prize, or that paper in Nature, we have to have a PerCP-Cy5.5 antibody and a FITC antibody. Here's the FITC antibody I absolutely have to have, and here I'm looking at that antibody in FITC and in PerCP-Cy5.5.

00:45:52 – 00:46:40, Slide 84

Now, I add more antibodies to this panel. I've added the two antibodies that I cannot live without, and I've added APC-Cy7, BV421, PE/Cy7 and the other antibodies that I have to have in my panel. With all of these dyes in the panel, I can still use the two antibodies I need without any problems. You can see the background expands a bit from spreading of the other fluorochromes, but no matter where I set a gate, I can easily determine the population percentage for those two dyes.

00:46:40 – 00:47:30, Slide 85

Now, I add PE, BV605 and BV711, and now I have a problem. The two antibodies I'm using, the FITC and the PerCP-Cy5.5, are not a problem if the staining is as bright as it looks here, but if the staining is dim, I now have background fluorescence all the way out to the third log in the PerCP-Cy5.5 channel. The FITC channel is virtually unaffected; FITC is one of those dyes that plays nicely with others and certainly does not accept error from any of these dyes. But in the PerCP-Cy5.5 channel there is background expansion, and right now all I know is that it's from PE or BV605 or BV711. Certainly, the more experience you get with panel design, the better you'll know where it's coming from.

00:47:30 – 00:47:39, Slide 86

This little trick can tell you exactly where it’s coming from.

00:47:39 – 00:47:50, Slide 87

Here, I can look at every single-stained sample in my FITC channel, and like we saw before, all of these single-stained samples look fine when we detect them in the FITC detector.

00:47:50 – 00:48:47, Slide 88

Now, when we look at all of the dyes, single stained and concatenated in FlowJo, we can see exactly where the problem is: PE and BV711. When I add these two dyes, they contribute quite a bit of background expansion to the PerCP-Cy5.5 channel. I know that if I'm going to use an antibody in PerCP-Cy5.5, it has to be brighter than 10^3, because if it's dimmer than that, I just won't see it in multicolor space, unless I use it on different cells than the ones I'm staining with the PE and the BV711. Our new Super Bright dyes that also emit off the violet laser would have the same characteristics, because they have the same, or very similar, emission spectra.

00:48:47 – 00:49:33, Slide 89

Here, for example, are all the same dyes in the BV711 detector; you can see PerCP-Cy5.5 is going to cause some background issues for BV711. And here you can see all the dyes that cause background issues for PE/Cy7. Again, PE/Cy7 is very bright and often used in multicolor panels, but you can see it's not a place where you would necessarily want to make a dim measurement. These are very important measurements to make, to understand which dyes contribute spread and which dyes you might want to either keep separate or save for your brightest markers.

00:49:33 – 00:50:53, Slide 94

Let's just do a quick spreading error review. Spreading errors are basically caused by photon counting statistics. I like to think of photons like kids. We really like energetic kids: blue, green and violet photons have a tremendous amount of energy. When they hit that cathode, they knock off lots of electrons, and as scientists, we like lots of numbers, right? We can count those, we get great statistics, we get very low variation. Unfortunately, the red photons are like teenagers. I love my children, I raised them through their teenage years, but they're a very bad investment. You put lots of energy in, you never get that energy back, right? There's loss through rotation, translation and heat, and when they knock off electrons, sometimes they knock off a little, sometimes they knock off a lot, and when we count them, there's just this spread. We can still use them; there are advantages to using them. But we need to understand that we get this spreading error, that there is a way to predict it and that there's a way to use them wisely.

00:50:53 – 00:51:36, Slide 95

Here's an example of a great paper from Mario Roederer's group with a calculation of some spillover spreading matrices. This is another paper where the math is very complicated, and if you have to do it by hand, I don't envy you. We used to do this on all of our machines in the lab and it took a lot of time, but it's very valuable data. Here you can see where the fluorochromes don't get along and where you're going to have issues. These are LSR II instruments, looking at Qdots and some of the other fluorochromes.

00:51:36 – 00:53:13, Slide 96

This is an example of a spillover spreading matrix that I generated in FlowJo with some of the scientists at Thermo Fisher. This is a very easy thing to do in FlowJo. You take the data from single-stained samples, these are CD4 stains on an Attune NxT V6 configuration, and the numbers give you warning signs based on the red color. For example, this is VL5 and RL2 on the Attune NxT, Super Bright 702 and Alexa 700. These emit in the same channel off of two different lasers, so you can see where they might have an issue. This is just a little bit of a warning that you're going to have to be careful if you use these two together on the same cell. This is really helpful, and you can do it easily with single-stained cells; this is with CD4 Super Bright dyes and several different fluorophores on an Attune NxT. It doesn't take long: just run the single stains, throw them in FlowJo, and here's the button you click on for the spillover spreading matrix.
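FlowJo does the calculation for you, but the per-pair statistic can be sketched roughly. This is a simplified reading of the spillover spreading idea (extra standard deviation picked up in the spillover detector, scaled by the square root of the signal in the primary detector), and the data below are simulated:

```python
import numpy as np

def spillover_spread(spill_pos, spill_neg, primary_pos, primary_neg):
    # Simplified spillover-spreading value: how much extra standard
    # deviation the stained population shows in the spillover detector,
    # per square root of the median signal gained in the primary detector.
    extra_var = np.std(spill_pos) ** 2 - np.std(spill_neg) ** 2
    d = np.median(primary_pos) - np.median(primary_neg)
    return np.sqrt(max(extra_var, 0.0)) / np.sqrt(d)

# Simulated single-stained control: bright in the primary detector,
# broadened in the spillover detector
ss = spillover_spread(spill_pos=[-10, 10, -10, 10], spill_neg=[0, 0, 0, 0],
                      primary_pos=[10_000] * 4, primary_neg=[0] * 4)
print(ss)  # prints 0.1
```

Computing one such value for every dye/detector pair from your single-stained controls gives you the full matrix; the exact formulation in the published method differs in detail, so treat this as an approximation.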

00:53:13 – 00:53:42, Slide 98

We'll talk very quickly about staining controls. This is in the literature, and there are a lot of great articles here; I won't go too heavily into them. There are some very important controls that you need when you're doing multicolor flow cytometry, and the number one control that I find people are not using as much as they probably should is the fluorescence minus one (FMO) control. These controls are not compensation controls, they're gating controls. Then we'll talk a little bit about the rest of the ones on the list here.

00:53:42 – 00:54:21, Slide 100

Fluorescence minus one controls contain all of the markers except the one of interest. If I want to run a FITC FMO control to determine what's positive and negative for FITC, I stain with all the other fluorochromes in my panel except for FITC. FMOs help us control for the effect of spread, they allow delineation of positive from negative, and they're important for low-density or smeared populations.
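The bookkeeping for FMO tubes is mechanical and easy to automate. A sketch, using a made-up three-color panel:

```python
def fmo_controls(panel):
    # Given a full-stain panel {marker: fluorochrome}, return one FMO tube
    # per fluorochrome, each containing every stain except the one of interest.
    return {
        f"FMO-{fluor}": {m: f for m, f in panel.items() if f != fluor}
        for fluor in panel.values()
    }

panel = {"CD3": "FITC", "CD4": "PE", "CD8": "APC"}  # hypothetical panel
for name, tube in fmo_controls(panel).items():
    print(name, sorted(tube))
```

For a 14-color panel this produces the 14 FMO tubes mentioned below, each missing exactly one stain.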

00:54:21 – 00:55:40, Slide 107

Here are a couple of examples of how a fluorescence minus one control is used in gating for flow cytometry. On the left in both cases, we set a gate on the unstained. On the right, you can certainly see that that gate is not appropriate for the full stain and you want to move the gate up, but you certainly can't convince a reviewer that Carol said I could move the gate up. Unfortunately, that doesn't get a paper into a journal; I wish it did. If you run the FMO, you can see that the gate definitely defines the population and accounts for the spillover spread that we see there. Setting gates is not arbitrary, and with 30 years of experience I can certainly not set gates any better than you can; I use the proper controls as well. I don't want to rely on the unstained. People ask how many FMOs they need to run for every experiment. What I do is, if I'm doing a 14-color experiment, I'm going to do 14 FMOs with that first experiment.

00:55:40 - 00:56:56

Here, for example, in the first panel of plots that you see, the PE/Cy7 and the CD45 PerCP-Cy5.5, I would not need to use those two FMOs after the first time I run them, because no matter where I set those gates, I can get a good read on the separation between those two populations. Once I've run the FMO and I know where to set the gate, I can tell a reviewer, "I ran these FMOs the first time, I got good separation, and I didn't have to run them the next time." In the bottom two panels, I probably would not need to use the CD8 FMO, but I would certainly need to use the FMO for PE, because there the antigen is expressed on a continuum, and I would really need to set that FMO every time; moving that gate very slightly in either direction is going to change the value significantly. Here, I probably wouldn't even use quadrants for this gate. I would probably set them as squares, because quadrants really don't describe flow data very well.

00:56:56 – 00:58:09, Slide 108

Here again, you have controls based on what you're going to do to your cells. If you treat cells with radiation, stimulate them, kill them, or do anything that creates a change in your cells, there may be changes that affect your fluorescence, your scatter or their stickiness. You want to do as many controls as you can, especially at the beginning of an experiment, so that you can see any of these changes that might occur. Isotype controls are very controversial in flow; they're not often used anymore, and there's a tremendous amount of literature out there on them. If you have to use an isotype control, you want to add it to your FMO at the titer that you use for your antibody. Really, the only things an isotype control shows you are that you have a titration issue or a non-specific binding problem. It's something that you might want to consider using only in your FMO control.

00:58:09 – 00:58:45, Slide 109

Viability dyes are very important to eliminate unwanted cells. Dead cells are sticky, so it's very important to use a viability control in every sample. A dump channel is not necessary but often very helpful; you can eliminate unwanted cells from analysis. This can be very helpful in things like stem cell analysis. Stem cells are very rare, and we want to make sure we look at undifferentiated cells.

00:58:45 - 00:58:48, Slide 110

You can add differentiation markers all in the same channel, even in the same channel as your viability dye, and gate them out.

00:58:48 – 00:59:29, Slide 111

Here's a slide that shows the amine-reactive dyes for dead cell identification. We have eight different options at Thermo Fisher. These are great. We used to use EMA, which required ultraviolet light fixation, and when these came out, they were fantastic. Live cells have some amines; dead cells have many more. You can take these dyes, slot them into a place in your panel where you don't have another reagent, and gate the dead cells out. We have ArC amine-reactive compensation beads, so these dyes are really fantastic for use in your panel.

00:59:29 – 00:59:49, Slide 112

We also have a number of impermeant nucleic acid dyes for dead cell identification. These are very bright and come in a number of colors that you can also slot into your panel. These are used on live cells; the amine-reactive dyes can be used on live cells or on fixed cells.

00:59:49 – 01:00:49, Slide 113

Here's an example of the importance of using a viability indicator. We know forward and side scatter are not to be used for dead cell identification, because they are not a good indicator of viability. In the upper left panel you see a forward and side scatter gate, using log side scatter as the Perfetto, Herzenberg and Roederer labs often do. If you gate separately on the live and dead cells as determined by scatter, you'll get a very, very different phenotype, and you'll still have dead cells in the final analysis that look like CD107-positive cells. It's very important that you gate them out at the beginning using a dead cell indicator.

01:00:49 – 01:01:15, Slide 115-120

Here’s a slide just looking at the differences that you’ll get when you exclude dead cells, when you use a dump channel and when you exclude doublets. If you’re looking at rare cells or if you’re looking at any cell, it’s really important to exclude the cells that you need to exclude in flow cytometry.

01:01:15 - 01:02:02, Slide 121

In summary: know your instrument and its configuration; know the performance test determined by your manufacturer, what its results tell you, and how your instrument is performing; do a voltage optimization; know your fluorophores and their brightness; look at a stain index; look at your compensation and your spillover spreading. Titrate, very important. Then, when you put the panel together, look at the spillover spreading and the antigen density, and always, always be a control freak. Use your controls for gating.

01:02:02 – 01:02:58, Slide 122

I'll leave you with a slide with lots of educational resources. I think it's very important to have mentors and to have web resources. Believe it or not, I did most of my flow before there was a web, but now we have tremendous resources on the web. We have guided learning, you can use our School of Fluorescence and LabRoots of course, and we have archived webinars, some very basic and some very advanced. We have the Invitrogen Flow Cytometry Panel Builder, where you can look at what reagents to choose for your panels, plus learning centers and publications. Friends don't let friends do flow cytometry alone. I appreciate your attention. Thank you very much, and if you have any questions, please let me know.

01:02:58 - 01:03:15

Moderator: Thank you, Ms. Oxford, for your informative presentation. We will now start the Q&A portion of the webinar; if you have a question you'd like to ask, please do so now. Just click on the "ask a question" box located on the far left of your screen. We'll answer as many questions as we have time for.

01:03:15 - 01:03:21

Question 1: Our first question is, you mentioned there are a variety of methods for optimization. What are the other methods used and do you have a recommendation?

01:03:21 - 01:04:52

Yes, that's a good question. We've been going back and forth about those methods for probably 20 years now. We just did a study at Thermo Fisher and published a poster about it, and I can certainly e-mail it to participants who would like it. There are probably also talks about this on the website of ISAC, the International Society for the Advancement of Cytometry. My favorite method is the easy one, but of course I want to get good data, too. I think the peak 2 method is the one I like; it uses a single bead and looks at the standard deviation and the CV. But there are a lot of methods out there, including methods published by Mario Roederer's lab. If you want to take a deep dive, Simon Monard, M-O-N-A-R-D, did some talks on this and has published as well. There's a lot of data out there, but my personal favorite is the peak 2 method. The antibody capture bead method is good as well; it just takes a little bit more number crunching.

01:04:52 - 01:06:28

Question 2: What is the difference between compensation spillover and spillover spreading?

That's a very good question. I really don't like these terms. Spillover sort of makes you think that there's so much fluorescence it spills over from one channel into the next, and it's really not that. It's like I spoke about with FITC: spillover is exactly that you're detecting the photons of one fluorochrome in an adjacent PMT, or it doesn't even have to be adjacent, you're just detecting the photons of one fluorophore in a detector where you're going to make another measurement. Spreading is caused by photon counting error. That's just the nature of photons: if they have low energy, the teenagers, then they produce a lot of spreading. It's just physics, and compensation helps us visualize that spreading, but it doesn't fix it. Until we get new PMTs that are better at detecting these photons, there's just no way around it other than good panel design.

01:06:28 - 01:08:23

Question 3: We can have time for one more question. I don’t have FlowJo. Is there any other way to generate a spreading matrix?

Sure, yes. You can actually use the papers; there are two papers. The one that I gave in the lecture is certainly available, though the math in that paper is a little more complicated than I like to do by hand. There is also a paper, I'm trying to remember it off the top of my head, a Mario Roederer and Marty Bigos paper; I think it's called "Seventeen-colour flow cytometry: unravelling the immune system," and it has a bit of an easier equation that looks at the spreading. You can use that equation to generate a matrix by hand; we certainly did that before FlowJo was able to do the math. You can also take the matrix in either one of those papers and use it as a general guide. Certainly, the values of those numbers will change instrument to instrument, but the relationships between the fluorophores are not going to change. The pairs that are an issue on any machine are going to be an issue on every machine. They may be more or less of an issue, but they're still ones that you're going to want to know about, so you can design your panel from the beginning knowing that that is going to be an issue.


Moderator: I'd like to once again thank Ms. Oxford for her presentation, and before we go, I would like to thank the audience for joining us today and for their interesting questions. Questions we did not have time for today will be addressed by the speaker via the contact information you provided at the time of registration. We would like to thank LabRoots and our sponsor, Thermo Fisher Scientific, for underwriting today's educational webcast. This webcast can be viewed on demand through April 2019. LabRoots will alert you via e-mail when it's available for replay. We encourage you to share that e-mail with colleagues who may have missed today's event. Until next time. Goodbye.

How recent technological advances in flow cytometry instrumentation are enabling faster throughput

This webinar presented by J. Paul Robinson at Purdue University, will focus on advances in the Invitrogen Attune NxT Flow Cytometer and outline where the technology within the Attune NxT fits in current and future applications.

00:00:00 - 00:01:27, Slide 1

Hello, everyone. Welcome to today’s live broadcast, “What are the Current Advances in Flow Cytometry,” presented by J. Paul Robinson, PhD. I’m Judy O’Rourke with LabRoots, and I’ll be your moderator for today’s event. We’re excited to bring you this educational web seminar presented by LabRoots and sponsored by Thermo Fisher Scientific.

Before we begin, I’d like to remind everyone that this presentation is interactive. We want to hear from you. Questions, comments and even answers can be submitted via the green Q&A button at the lower left of your screen. We’ll try to get to everyone, but if not, we will make sure to follow up with you by email. You can also enlarge the slide window by clicking on the screen icon in the lower right-hand corner of the slide window. If you cannot hear or see this presentation properly, let us know by clicking on the Support button at the top right or the Q&A button in the lower left.

I now present today’s speaker, J. Paul Robinson, PhD, The SVM Professor of Cytomics in the College of Veterinary Medicine and a Professor of Biomedical Engineering in the Weldon School of Biomedical Engineering at Purdue University. I will now turn the presentation over to Dr. Robinson.

Well, thank you, Judy. Welcome to all of you to this discussion on advances in flow cytometry. 

00:01:27 - 00:02:25, Slide 2

What I want to do before I get into the details of the webinar is give a small background on flow cytometry. I realize that a lot of you will probably be experts in the field, but there are always some people who are not, and I don’t want to leave those individuals behind.

So flow cytometry has been around for a long time, over 50 years. It started out with just one or two parameters. That’s important for today’s discussion because it’s now moved to a large number of parameters. From 10 to 25 parameters might be a possibility, and I think that will increase over the years. Now, most flow cytometry is going to be around 5 to 15 parameters, and that’s the general area that I’m going to talk about today.

00:02:25 - 00:03:44, Slide 3

Flow cytometry, as most of you know, measures single cells. It’s a unique technology for measuring single cells. The reason is that it can evaluate properties of each population within a mixed population without physically separating those cells. That’s a pretty powerful capacity for a technology that can look at single cells. While it does that, it creates a strong basis for statistical analysis. That means that we can actually quantitate information in each of those populations that we might want to measure.

Now, the things that you can measure are either extracellular receptors, typically immunophenotyping, or intracellular components. You can look at DNA or other sub-cellular organelles, or at molecules produced in the cell or on the cell, so there are a lot of things that you can do with this technology.

00:03:44 - 00:06:54, Slide 4

Now, no technology is perfect. There are limitations in everything. I want to mention six limitations because I’ll be discussing most of these during the webinar with respect to the Attune [NxT Flow Cytometer]. First of all, sensitivity. The system has to have detectors capable of measuring the number of molecules of the marker that you have attached a fluorochrome to. Sensitivity over background becomes important.

Concentration – sample concentration and flow rate. Those can be really complicated issues in flow cytometry. They will get you one day or another: your sample is too concentrated or not concentrated enough, and that will cause you some problems. We’ve all faced that issue over the years.

Sample volume. What happens if you just simply don’t have very much volume? What do you do? That’s an interesting question because when we talk about flow cytometry, I can guarantee you that we always say, “Look, flow cytometry is fantastic because you can look at such small volumes.” This is true, except that if you only have a few microliters of sample, then it’s not so easy to actually run it on most flow cytometers.

Then the other issue is the number of samples per hour that you might have. We typically have run tubes over the years in flow cytometers. Today, we move to plates. We need to look at what the potential flow rate is and what the sample rate is – not just the flow rate. How many samples can you run per hour? And not just how many samples can you run per hour, but how many samples can you run per hour and get good data from?
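As a back-of-the-envelope illustration of the samples-per-hour point, effective plate throughput is set by the per-well overhead as much as by the flow rate itself. The timings below are invented for illustration:

```python
# Effective plate throughput: acquisition time per well plus the fluidics
# overhead between wells. All timings are invented for illustration.
acquisition_s = 20.0   # seconds spent actually collecting events per well
overhead_s = 15.0      # seconds of aspiration, mixing, and rinse per well

wells_per_hour = 3600.0 / (acquisition_s + overhead_s)
print(round(wells_per_hour, 1))  # 102.9
```

Halving the acquisition time here only raises throughput by about 40%, because the fixed overhead between wells still dominates – which is why sample rate, not just flow rate, is the number to watch.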

Then number five there is the number of parameters and colors. I indicated that a lot of instruments are now coming onto the market in this range of 5 to 15 parameters, which seems to me to be a pretty popular range. That’s because we have really developed some good assays where we can get assays running with a fairly small number of cells and still be able to look at a vast number of parameters.

The last thing is a tricky issue, and that is looking at the size of particles, small or large particles. That’s a tough one for a lot of flow cytometers. Looking at very, very small particles is something that I’m seeing a lot of flow cytometry doing. I anticipate we’ll be looking more at that area. 

00:06:54 - 00:08:16, Slide 5

For the next 45 minutes or so, I’m going to discuss my experience with the Attune. Over the last several months, I’ve had the opportunity of playing with the Attune [NxT Flow Cytometer] in my lab. I have spent quite a bit of time with colleagues who have this instrument, and of course, my lab staff who have been using it. I thought that this would be a good opportunity to outline some of the features that I actually find attractive about the Attune [NxT Flow Cytometer] instrument. I’ll discuss some of the unique advantages of the technology because in every instrument, there is usually something that is really quite unique on that instrument. There are a couple of things on the Attune that I’d like to share with you.

I’ll try to expand on some of those issues because I have discussed this with a number of people and I’ve found that some people don’t quite understand exactly how the technology works. My goal will be to try to outline that in a fairly easy-to-understand way.  

00:08:16 - 00:08:56, Slide 6

Now I’d like to put this picture up. It’s a picture of Howard Shapiro. I think you probably all know him and you see his book, Practical Flow Cytometry there, which I highly recommend. If you haven’t got this book and you’re interested in doing any type of flow cytometry, you really owe it to yourself and your lab to get a copy of this book. It’s a tremendous book. I call it the bible of flow cytometry. There is one thing that’s quite unique about acoustic flow cytometry, and that is that it’s not mentioned in Howard Shapiro’s book. 

00:08:56 - 00:09:25, Slide 7

That brings me to the interesting discussion now about flow cytometry and how the Attune [NxT Flow Cytometer] works because it has this unique feature of having a different mechanism for aligning cells. I’m going to spend a little bit of time talking about that because it’s one of the most important features, I think, of this instrument. 

00:09:25 - 00:10:08, Slide 8

This is the Attune [NxT Flow Cytometer]. It’s an analyser; it can measure up to 14 colors and two scatter parameters. It’s an instrument that has the same general features of most instruments. It has a sample delivery component. It has a fluidic system. It has optics and electronics. Uniquely, it has the acoustic technology.

00:10:08 - 00:11:19, Slide 9

Let’s discuss the acoustic technology. Now I’m not sure that too many of you will be familiar with The Journal of the Acoustical Society of America. Probably, most of you don’t care and will never look at this journal, but I want to show you some of the things that you miss if you haven’t looked at this journal.

Now, if you were interested in the acoustics of Italian historical opera houses, you would have seen a wonderful paper in a recent issue of this journal. If that doesn’t tickle your fancy, how would you like to be able to measure the height of a speaker in a room simply from the acoustics? Now, when you think about that, that’s pretty amazing. How high is the speaker standing at a lectern, just from understanding the acoustic information?

Well, that’s the sort of paper that you might find in The Journal of the Acoustical Society of America.

00:11:19 - 00:11:50, Slide 10

It’s not the only paper. It turns out that a few years ago, Greg Kaduchak published a paper in this same journal. That paper was entitled, “Ultrasonic Particle Concentration in a Line-Driven Cylindrical Tube.” Naturally, all of you would have searched out this paper and reviewed it and read it and would have fully understood it. 

00:11:50 - 00:12:01, Slide 11

Well, perhaps not, but had you been looking for this sort of technology, then you would have learned about the beginning of acoustic flow cytometry. 

00:12:01 - 00:12:47, Slide 12

Let’s look a little bit at the paper that Greg published. Here is a picture of the tube that he made. On the side here, you’ll see there is what’s called a line source. What he did was he said, “Well, I understand what acoustic waves can do. What would they do if I put an acoustic device on the side of a tube?” Then he put this tube underneath a microscope.

00:12:47 - 00:13:26, Slide 13

What he saw is illustrated here by this little animation. You see that when you put a speaker on the side of a tube, it builds standing waves. If there were dust in the tube, then that dust would end up in the center of the tube. That was like an aha moment for Greg because he realized that the dust particles would be retained in the center cross section of this tube. What that did was give him an idea.

00:13:26 - 00:13:58, Slide 14

These are the pictures that he showed in his paper. On the left here, you see the tube before the acoustic device was turned on, and on the right, you see the particles pretty much aligned in the center. This was the start of an idea.

Now you don’t transform these ideas into technologies without looking at the theory. 

00:13:58 - 00:14:12, Slide 15

Just in case any of you would like to double-check Greg’s equations, I’ve put them up here. I’m sure you’ll agree with me that they look great, and we’ll just move on. The bottom line is that he tested the theory of this and then built a prototype.

00:14:12 - 00:15:01, Slide 16

Now, if you go back a few years to when he built this first prototype of the acoustic system, you’ll see here it looks, well, pretty much like a prototype – something that probably looks like the things that I build in my lab.

Interestingly, his first thoughts on this, he brought to people at Los Alamos, and he showed them this idea and some of the people at Los Alamos said, “You could do this – you could build a flow cytometer with this.” The original idea was maybe you could do this without having sheath fluid. That was the original concept of building a sheathless flow cytometer. 

00:15:01 - 00:16:50, Slide 17

As you develop technologies, as I will explain to you in a moment, things don’t always work out the way that you start. Here is some more material published by Greg. I want to show you what the effect of this transducer over here is.

On this side [left], there is a cartoon, and on here is a real image. This was actually originally a video, but I couldn’t show the video on this system, so I broke the video up into some small sections. I’ll show you, first of all, what we have as the system starts.

This is before acoustic focusing is turned on. Now, what we’re going to do is turn on the acoustic focusing, so turn it on just a little bit.

You see the cells starting to move to the center. We turn it on a little bit more. The cells are further in the center.

You can see over here on this side [right] that they’re really starting to align quite nicely. Now we’ve got the acoustic system on full bore. Look at this alignment. They’re just perfectly aligned. Over here, in the real image, you see that the cells are in the center of the core. Here, of course, are the acoustic waves as they go through the system.

From these early experiments, Greg was able to show, wow, this really will work on a flow cytometer. 

00:16:50 - 00:19:38, Slide 18

There are some advantages to this. That’s really the key point of this webinar: to explain some of the key advantages of acoustic flow cytometry. Now, you’re all, I think, familiar with hydrodynamic focusing. This is what we do in almost every machine. We have a sheath fluid that goes through here. We have a tube here, a sample delivery tube. The cells in here are sort of all over the place. What we do is focus them down through this – what Howard Shapiro called the neck-down region here. They focus down, and we have pretty much single cells. If you increase the pressure on this to go faster with traditional hydrodynamic focusing, you don’t end up with a good situation. We’ve all seen in our systems that you lose some of the integrity or the quality of the data.

What’s the difference in an acoustic system? I want to make a pretty important point at this stage. The current version of the acoustic flow cytometer, in fact, the commercial instrument that was built, wasn’t exactly the same as the original concept that Greg developed. He developed a concept that was basically sheathless – it sounded like a good idea at the time. Practically, it is not quite as good as a sheath-based system. What Greg discovered is that when you put the two systems together, that is, when you have both the hydrodynamic focusing and the acoustic focusing, you get the best of both worlds.

Now, you can probably get away with hydrodynamic focusing alone, which is what many of the commercial instruments that are out there do, since they don’t have this technology. That’s fine when you’re going reasonably slow. The powerful advantage of the combination of acoustic focusing and hydrodynamic focusing is that if you decide to move this system at high speed, then this is where the real advantage is. I’m going to spend some time talking about that.

00:19:38 - 00:20:18, Slide 19

Let me reiterate a couple of things by showing you some really old slides. These are slides that have been in my class deck for over 20 years. I try to explain with this slide the concept of hydrodynamic focusing of the core here and cells coming down the core and the fact that we have a laser coming through here somewhere and at about 90 degrees, we measure the fluorescent signal. This is all very well, but in fact, in most analysers, we don’t do this. We actually do this. 

00:20:18 - 00:21:01, Slide 20

We actually turn the system upside down. We flow upright. The only time we really flow down is when we’re sorting. It’s a little bit difficult to sort going up because, well, it would be rather crazy, but for analysis, it’s far better to flow up. I just wanted to refocus us on how this concept works because this helps us see the difference between a traditional instrument, which has hydrodynamic focusing.

00:21:01 - 00:21:15, Slide 21

And an instrument that has both hydrodynamic focusing and this acoustic module on the side. 

00:21:15 - 00:21:41, Slide 22

The advantages here are that by the time the cells reach the hydrodynamic focusing area, they are already in line. This allows you to increase the flow rate significantly without losing the integrity of the coefficient of variation, which, as we all know, is important.

00:21:41 - 00:22:25, Slide 23

Let me summarize a little bit of what we’ve been discussing. Why does this matter? I always like to ask myself that question. Why does it matter? Why do I care? The instrument gives you the best of both worlds, because we know that hydrodynamic focusing works well in most instruments that are out there, and when you add acoustic focusing, you get a new dimension. You’ll see that in a little while when I show you some different data. Highly stable flow at high speeds is quite difficult to achieve on most instruments, but that’s definitely one of the advantages that we see in the Attune [NxT Flow Cytometer].

00:22:25 - 00:22:44, Slide 24

Let’s move on and discuss some of the other features of the instrument, because there are some things that I quite like about the instrument and that I think make it a very attractive instrument. Let me move to the optics and electronics.

00:22:44 - 00:23:50, Slide 25

If we take the cover off, which I suspect you’re not supposed to do, this is what you will see. On the left there – let me block out some of this so that you just focus on some parts of the instrument.

There’s a lot of stuff here. We have some preamplifiers – you see a bunch of preamplifiers on this instrument. The photomultiplier tubes are inside there somewhere. Then we have a bunch of dichroic filters that you can see here and a stack of bandpass filters. This is the core optical assembly that you have on the instrument. You see here there is a fiber optic coming in from somewhere, and that is delivering light to the detector system.

00:23:50 - 00:25:10, Slide 26

Coming back to that original picture, this is now showing us where the lasers are, and we have the possibility of multiple lasers. The instrument that I have in my lab has four lasers, but I understand that you don’t have to have four lasers. You can have three, two or one.

The blue laser is generally in that position. The yellow or green laser is there. The red laser is over here. The violet laser – at least, the laser we call the violet laser; it’s a 405, which sort of looks a little bit violet – is down in the front of the instrument. If you look at the path that those lasers follow, it’s pretty much as I have outlined here on this graph. You’ll see that the lasers are actually physically separated from one another so that the cell goes through each laser successively. There are some advantages to doing that, which I’ll discuss in a moment.

00:25:10 - 00:25:28, Slide 27

Let’s, for a second, look at these lasers. They’re quite interesting. The first thing that I noticed when I pulled the lid off the machine is that the lasers have a slightly different structure to what I was used to. The laser itself is hidden underneath the instrument at the back, but the delivery comes here into this optical assembly. You have this round system here that is an optical assembly. What are the lasers?

00:25:28 - 00:26:30, Slide 28

They’re actually Coherent lasers, which have been designed particularly for this instrument. There are some unique features about the laser. The first feature is that it’s fiber-optic delivered into this big blob here. The question is: what is in that blob?

00:26:30 - 00:27:20, Slide 29

So let’s clear all these marks that I have made on here so we can look at this optical device. There are a number of features. The laser comes in here, and then there is a collimating lens that focuses the laser. Then there is this thing called a Powell lens. What that does, as I’ll explain in a moment, is spread the beam out a little: instead of being a perfect Gaussian beam, it gets a little bit of a wider spread. There’s a value in that, which is quite unique in the way that the Attune uses its laser to focus. Then there is a telescope and a doublet lens here. I’m actually going to jump this slide and come back to it.

00:27:20 - 00:29:44, Slide 30

This is what the laser signal would look like if you were able to see this on your machine. Actually, the specification of the laser is such that it has to have this format of beam. We call this a top hat, or the Attune [NxT Flow Cytometer] uses the term “flat top.” There is an advantage in this because, as you know, we often broaden the laser out a little bit so that when a cell goes through the laser beam, it gets an equal amount of intensity. We don’t like the situation where the intensity of the laser beam is, of course, strongest right here but falls off steeply as it goes down here. There is a huge difference in intensity at this point if the beam is shaped this way. That’s not a problem if you have this top hat. It doesn’t really matter if the cell comes in at any of these places; it’s going to get the same amount of intensity.

Now you might not care about that, but in fact, the quality of your data will change significantly, depending on what you are running if you are misaligned a little bit. Now I’m sure you could still be misaligned on any instrument, but the bottom line is that you have a better chance of getting better data if you have this broad beam.

For example, in cell cycle analysis, which I’ll mention later, that’s actually critical. You need to be very, very careful. We’ll talk about why the Attune [NxT Flow Cytometer], I think, has some really nice advantages there. 
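A quick numerical sketch of why the flat-top profile is more forgiving than a Gaussian beam: with a Gaussian profile, a cell passing off-center sees a sharply reduced intensity, while a flat top delivers essentially uniform intensity across its width. The beam waist and offsets below are invented for illustration:

```python
import math

def gaussian_intensity(x, waist):
    """Relative intensity of a Gaussian beam at distance x from its center,
    where waist is the 1/e^2 beam radius."""
    return math.exp(-2.0 * x**2 / waist**2)

waist_um = 10.0  # beam radius in microns, invented for illustration
for offset_um in (0.0, 5.0, 10.0):
    # A cell crossing 10 um off-center sees only ~13.5% of the peak
    # intensity, whereas an ideal flat-top beam delivers the full intensity
    # anywhere across its width.
    print(offset_um, round(gaussian_intensity(offset_um, waist_um), 3))
```

That intensity variation translates directly into pulse-height variation, which is why small misalignments broaden your CVs with a Gaussian beam but not with a flat top.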

00:29:44 - 00:30:35, Slide 31

Now, talking about the laser, the laser quality and the signal quality: this is something that people have talked about for years and years, and I think it is now becoming more common in instrument design that you get Levey-Jennings reports – plots for every one of your detectors – and the Attune [NxT Flow Cytometer] is no different. It has that feature, which is very important for quality control. If you want to review whether or not your instrument has been performing, this is the way to do it. I strongly recommend that you take note of those plots if you have not been doing so. This is something that’s very easy to see on the Attune, and I strongly recommend it.

00:30:35 - 00:32:00, Slide 32

Why does it matter that we care about the quality of the laser, the quality of the signal, the reliability of the alignment, et cetera? Well, it matters a great deal because the goal of most flow cytometry is to get a good quality CV – the best that we can possibly get – and that’s what we should strive for. Now, I know it’s hard to get great CVs in phenotyping, but that doesn’t mean you shouldn’t strive to get high quality CVs; for cell cycle it’s critical. That’s something that you should take note of. The stability of the signal intensity is certainly important for experiments performed over days or months. If you want to compare results, you certainly want to know that the data are compatible from one day to another – or comparable, which is probably a more accurate way of saying it. If you want to be sure that the data are the best, you should strive to calibrate your system and make sure that you keep all of these things tuned appropriately. That’s of course a scientific goal that I think everybody has.
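For reference, the CV being discussed is simply the standard deviation of a peak divided by its mean. A minimal sketch, with invented intensities:

```python
import statistics

# Invented fluorescence intensities for a single DNA-content (G0/G1) peak.
intensities = [100.0, 102.0, 98.0, 101.0, 99.0, 100.0]

mean = statistics.fmean(intensities)
cv_percent = 100.0 * statistics.pstdev(intensities) / mean
print(round(cv_percent, 2))  # 1.29
```

A tight peak like this, with a CV near 1%, is the kind of result cell cycle analysis demands; phenotyping populations are typically much broader.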

00:32:00 - 00:32:16, Slide 33

Well, there are a couple of other things that I think are definitely worth discussing on the Attune [NxT Flow Cytometer], and one of them is the fluidics system. Interestingly, I’m going back here to another slide that I have had for many, many years.

00:32:16 - 00:34:58, Slide 34

The only modification I’ve made to this slide in 20 years is that I added the peristaltic section here, because that’s been something that I’ve noted in the last several years in terms of delivery of samples. Traditionally, there were two types of systems. There was a positive pressure system, and there was a positive displacement syringe system. These have been around for decades, and you know that in a positive pressure system, the key to stable flow is having an appropriate differential pressure between the sample pressure here and the pressure on the sheath flow in here. We know that to get nice, stable, non-turbulent sample flow here, we need to regulate the pressure very carefully, and we get this very small differential, and that’s how we basically change the rate of flow of cells.

It’s very different in a positive displacement system. In a syringe system, which is what the Attune [NxT Flow Cytometer] is, it’s quite different. The first thing that we notice is that there’s a syringe here, there’s a sample loop of some sort, and then there’s your sample. The goal here is to pull the sample out of your sample tube into some kind of a sample loop, then you change your valve, and then you deliver that at a constant flow rate, and your sample goes through the system. Now, the advantage of the syringe system as opposed to a positive pressure system is that syringe systems are innately quantitative. You can measure the absolute flow rate or the absolute number of cells because you know exactly the volume and you know exactly the length of time that it’s run, so that’s actually really important. Now, I haven’t changed this in many years, but this is really no longer true. The Attune [NxT Flow Cytometer] can go much, much faster than this. That’s actually a significant change in the Attune [NxT Flow Cytometer], and something that I’ll address again in a moment.

00:34:58 - 00:36:27, Slide 35

The concept of a syringe system allows you to do absolute counts. This is just a photograph, again taken at the back of the instrument. I’m not sure that you’re supposed to do this, but when an instrument comes into my lab I tend to pull it to pieces to see how it works, and this gives me a very good understanding of how the instrument actually performs; I can see the quality of the construction, how the whole system is designed, and how easy it is to repair. This is a photograph I took of the back of the system. Over here is the syringe, and you put the sample into your sample tube, and that’s somewhere over there. It’s sucked into a sample loop, and you might wonder why there are all these parts of this system. This valve is a fairly complicated valve, and that valve allows you to pull up a sample into a sample loop. It has to have ways of then delivering a sample, of cleaning, et cetera, but this is on the side of the instrument. This is the part that actually delivers the sample to the instrument. The fluidics environment is over here.

00:36:27 - 00:38:03, Slide 36

Now, something very interesting here about this system: I mentioned earlier that Greg’s original idea was to build and design a system that was sheathless. Well, it didn’t turn out quite that way. It isn’t sheathless; it’s actually less sheath. This shows the sheath tank, and it’s really very small. It’s tiny. This means that the instrument is capable of going for extended periods of time with a very small amount of sheath. That’s a nice feature about the system, and that whole sheath tank is right here on the front of the instrument. Now, while I’m looking at this slide – I haven’t discussed the automation yet, but if you had a plate reader on the side of the instrument, it would be sitting here. I’ll draw your attention to this chamber here. If you open that up, you’ll see that there’s a sheath tank here. This also is very, very small.

00:38:03 - 00:38:24, Slide 37

These are about two inches, maybe 50 or 60 millimeters, in size – very small. As sheath systems go, the instrument takes very little. And just another reminder of the fact that because you have this syringe, you have volumetric delivery and absolute counts.

00:38:24 - 00:39:11, Slide 38

Again, let me ask the same question: “Why does it matter? Why do I care?” Well, absolute counts will actually save a lot of money. If you don’t have an instrument that does absolute counts, then you’ll have to buy expensive beads and calibrate those beads, so there’s an advantage, I think, in doing absolute counts using a syringe system. The other thing is that it’s a tremendously stable flow, and you have highly controlled sample delivery. The other advantage, I think, is that you can go from slow to extremely fast.
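The absolute-count arithmetic is simple once the analyzed volume is known exactly, which is what the syringe gives you. A minimal sketch with invented numbers:

```python
# Volumetric absolute counting: because a syringe-driven system knows the
# analyzed volume exactly, concentration falls straight out of the event
# count with no counting beads. All numbers are invented for illustration.
gated_events = 25000     # events in the population of interest
analyzed_ul = 50.0       # microliters actually pushed through the flow cell
dilution_factor = 2.0    # sample was diluted 1:2 before acquisition

cells_per_ul = gated_events / analyzed_ul * dilution_factor
print(cells_per_ul)  # 1000.0
```

On a pressure-driven instrument, the analyzed volume is not known directly, which is why a known concentration of reference beads has to be spiked in to do the same calculation.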

00:39:11 - 00:39:25, Slide 39

Let’s look at the actual sample delivery. This has some interesting features too that I noticed on this instrument and found quite attractive.

00:39:25 - 00:41:33, Slide 40

This is the sample tube, and the system is pretty much the same as other instruments in that you can stick a tube on here. It’s a spring-loaded system, it pops up, the sample collection tube is in the center of the tube, and it does what most systems do. It just sucks the sample up into a loop, and you’re good to go. I’ve shown you this picture for another reason, and that’s because you see a second tube coming in here, and this of course is where your plates are.

What’s interesting about this system is that in my lab we run a lot of plates on the plate reader, and one of the things the staff like to do, instead of using a well on the plate to set up the instrument, is to do it the more traditional way: take some sample, put it in a tube, put the tube in the machine, run that tube, set up the voltages, and get everything right. Then they’re happy that they can switch over, put their plate in and go. The advantage is that hidden underneath here is a valve, and that valve links this, this, and the system, and by simply pressing a switch, you can switch from here to here and run the system. That’s actually a really, really nice feature. It’s a well-engineered system. A lot of thought has gone into the fact that yes, oftentimes you’ll use a tube to calibrate the system and then run a plate on your system.

00:41:33 - 00:43:23, Slide 41

So a few features that I mentioned earlier: reasonably good resolution in looking at small particles. It has the capacity to use the 405-nanometer laser for scatter, and of course, because you reduce the wavelength of the laser to 405, you increase the resolution, so that’s a nice feature. But I really want to focus on this feature here. You’ll see not bad resolution here, but the real point that I want to make is that when you have the acoustic system, you can run at really quite a high rate – even faster than that, but this is a very high rate of sample flow – and still get quite excellent resolution.

The other thing is that because, as I noted before, the beams are separated, you can run a 561 laser on this instrument, and because of that, you have separated beams, so you can look at your FITC and your PE signals with different lasers and avoid compensation. Well, that’s always a handy feature if you can do that, so I just wanted to point that out. Again, why do all these things that I’ve just discussed matter?

00:43:23 - 00:44:19, Slide 42

Why do I care? Well, I like the mechanism for being able to switch from tube to plate without doing anything but pressing a button. I think that’s actually quite nice, and also, it’s very easily accessible. It’s open, so putting a tube in there is easy – the technicians in the lab like that very much – and you can flow moderately fast without loss of resolution. I might actually change that to very fast or extremely fast; moderately is probably far too conservative a word there, but it definitely is very, very fast. And having on-board QC, et cetera, will certainly help you.

00:44:19 - 00:44:57, Slide 43

There are a couple more features that I’ll mention in the last few minutes of my presentation. One of these is automation. This already shows that you can put microtiter plates in here and run your microtiter plates directly into the system. It can be permanently hooked up – in my lab, it is permanently hooked up – and we switch frequently from running tubes in here to running plates in here. That’s the first feature.

00:44:57 - 00:45:37, Slide 44

A feature that I haven’t yet tested, but have seen in operation out in Eugene, Oregon at Thermo’s site, is their automation system. What they’ve done is they have taken their Orbitor [RS] system, which can grab a bunch of plates from here, pop them in here, and then put them back, and my understanding is that this can be done in a temperature-controlled environment – heated, cooled, whatever – and you can of course use an automated incubator here. This is a system I haven’t tried or tested, but I have seen it in operation, so I understand that this is the next thing. I’m sure that there are plenty of people out there who may well be interested in that as a feature.

00:45:37 - 00:46:33, Slide 45

I will point out that, of course, you can’t use the small container for sheath fluid if you’re going to run vast numbers of plates; you’ve really got to have a bigger system. I put these two pictures there almost as a comparison of size. This is the relative size, and this is quite a large container that sits on the floor; that’s required for the high number of plates if you are going to run the fully-automated system. So that is there. Again, I haven’t tested it, but I’ve seen it operating.

00:46:33 - 00:48:21, Slide 46

Why does this matter? Well, I think if you are in an environment where you want to run a large number of plates, the one thing that’s for sure is that automation is a very good thing. Running a lot of microtiter plates on flow cytometers over the last several years, and preparing a lot of plates using automated preparation systems, I’ve found that there is a reduction in error. Robots do exactly what you tell them to do. They are extremely reliable, they don’t get sick, they don’t charge overtime, they’ll work all night if you want them to, and they repeat things with precision. If you’re doing preparation of microtiter plates – and I suppose if you have one plate to prepare, then it’s probably not worthwhile, but if you have three or four plates to prepare for an assay – there’s no question in my mind that automation improves the quality of the data because the volume distribution is precise. It also facilitates timing if you have timing in your assay. If you are doing Phosflow assays or something like that, you’ve got to stop the reaction at a certain point, and it’s got to be precise, and you want the same every time, and robots are very good at doing that.

00:48:21 - 00:50:33, Slide 47

Okay. Let me now briefly mention cell cycle analysis; I’ve got just half a dozen more slides in the webinar. Cell cycle analysis is something that we’ve been doing in flow cytometry for literally 40 years. It was the tool of choice in the early days of flow cytometry; there was a time when people didn’t do much else but cell cycle analysis. One thing you know for sure if you do cell cycle analysis is that you cannot run samples quickly. It is the slowest, probably most boring type of flow cytometry, because you’ve just got to run things slowly, and the reason is that you need good CVs. If you don’t have good CVs, there’s no way you’re going to resolve a very small change in DNA content, because that’s what you’re measuring. So we know it takes several minutes per sample, it’s very time-consuming, and therefore it’s expensive. It’s not the reagents that are expensive. It’s the technician time, the lab time, and the flow cytometer time that will really kill you if you’re going to run cell cycle analysis.

Well, that model doesn’t work with the Attune [NxT Flow Cytometer]. In fact, it’s completely changed. I’m really quite amazed at this because, thanks to the acoustic focusing, it’s a total paradigm shift with the Attune. My suspicion is that people who aren’t doing cell cycle analysis, simply for the reasons I’ve mentioned, will do it if they have this instrument, because it’s almost a pleasure to do cell cycle analysis. You can run those samples through at much higher speed and get high-quality CVs, and if you do this on a plate, which is something that I never thought I would see, several things happen. 

00:50:33 - 00:52:42, Slide 48

First of all, the analysis is a cinch. One of the things I’ve learned over the last several years of running lots of plates is that most software is not well designed for looking at plates. Why? I don’t know, because it’s a perfect opportunity: you know where everything is, the formats are fixed, you’ve got 96-well, 384-well, et cetera. When we looked at cell cycle analysis, one of my staff looked at this and simply wrote some software to analyze the whole plate. Why would you want to look at cell cycle analysis like this? I’ll tell you why. There’s a very, very simple reason. 

Let’s say you’re interested in seeing whether or not there’s a change. “Oops, I’ve got the wrong marker. I discovered that you can actually draw a straight line if you pick the right one.” But look at this: you can see whether or not there’s a change in your G0, G1, or G2. Now, of course, you are just looking at the cells, but what you are doing here is saying to yourself, “Oh, which ones should I look at in more detail?” You can choose the wells you want to examine more closely, and you can ignore the ones where nothing much happened. That’s one reason. The other reason is that, sure, it’s all very well to look down a column that way, but one of the things my staff member did when we looked at this was to say, “Well, just click a button and rotate everything so that they’re going in this direction, and now you can look at the changes in rows as well as columns.” That never occurred to me when I looked at this, but yes, it’s all very well to look down the column, but there are things called rows as well. So the Attune [NxT Flow Cytometer] is able to do cell cycle analysis on an entire plate. Again, something that I never thought anyone would ever do. 

00:52:42 - 00:53:21, Slide 49

Just to show you the impact of speed: here we have slow, normal speed, very fast, what I called really fast, and something I’ve termed “ridiculously fast”. I would never have believed that anyone would run cell cycle analysis at this speed, but look at this: absolutely superb CVs at this rate. So can it be done, and can you do really good work? 
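The CV (coefficient of variation) the speaker keeps returning to is just the relative spread of a gated DNA peak: standard deviation over mean, expressed as a percentage. A minimal sketch of the calculation, using simulated G1 events with hypothetical channel values (not data from the webinar):

```python
import statistics

def peak_cv(values):
    """Coefficient of variation (%) of a histogram peak:
    100 * standard deviation / mean of the gated events."""
    mean = statistics.fmean(values)
    sd = statistics.pstdev(values)
    return 100.0 * sd / mean

# Simulated G1 DNA-content signals clustered around an arbitrary
# channel value of 200 (hypothetical numbers, for illustration only).
g1_events = [198, 200, 202, 199, 201, 200, 197, 203, 200, 200]
cv = peak_cv(g1_events)

# Distinguishing small DNA-content differences (e.g. aneuploid peaks)
# requires CVs in the low single digits; broadening the sample core
# at high flow rates normally inflates this number.
```

The point of the talk is that acoustic focusing keeps the CV low even at high sample rates, which is what makes fast cell cycle runs usable.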

00:53:21 - 00:54:03, Slide 50

Well, I’m going to show you something from my friend and colleague Professor Jake Jacobberger. This is really complicated stuff. Jake showed some of these data at a recent meeting, and was very generous in sharing some of his slides with me, which I understand are going to be published very shortly in the book that I’ve referenced here. Look at the detail and quality of the work that Jake did. Jake did all of these cell cycle analyses on an Attune [NxT Flow Cytometer], running them on plates.

00:54:03 - 00:54:43, Slide 51

Just absolutely gorgeous work, and great detail. This tells me that this is a transformational moment in cytometry: you can do cell cycle analysis at moderate or high speed. Certainly the sort of speed at which you can do your cell cycle experiments in an entire 96-well plate, run them automatically, and then have a very rich data set to analyze. This, to me, is transformational. 

00:54:43 - 00:55:50, Slide 52

So why does all this matter? I’m going to finish up with a summary on my next slide. Why does all this matter? Well, cell cycle is something that I have always hated doing, and not just me. The technicians in the lab don’t like doing it at all, because it’s so slow, everything has to be so perfect, and you’ve got to worry about blockages and about keeping the flow rate nice and stable. Actually, that’s not the case with the Attune [NxT Flow Cytometer]. You can get excellent CVs while really pumping those cells through very fast, and I am amazed at that. I wouldn’t have thought it was the case until I sat down and actually saw the data. That’s why I think it matters. 

00:55:50 - 00:58:23, Slide 53

Let me now finish my seminar and summarize some of the things I’ve said. In the last 55 minutes, I’ve really just tried to give my personal impression of the instrument: how we use it, what I like about it, and what I think is important. The acoustic focusing feature is something I had heard about, and it’s been around for several years now, but I really had to sit down and think about it, and actually talk with Greg Kaduchak, just to find out exactly how it works and why it’s so good, and I’m convinced it’s an excellent piece of technology. The fact that it is combined with a sample delivery system that can provide absolute counts, and that you can combine both of those things with hydrodynamic focusing, means that you get excellent data. The laser lines are delivered with a top-hat profile that’s so finely tuned that alignment is less likely to affect your results. If you do phenotyping and want to run a lot of samples, you can look at 14 colors with four lasers, and the separation of the four beams means you have fewer compensation issues to deal with, because you can set your system up in a way that is easier to manage.

The ability to switch from tubes to plates is a really clever engineering feature. The other thing that I didn’t spend any time on in this seminar, but will mention briefly now, is that you can run incredibly dilute samples. Greg told me a story about someone who had dropped a sample on the floor, and there was almost nothing left of it, and Greg said to the scientist, “Look, just put some saline in there, and we’ll run the sample.” They did, and they got great data out of it. The reason is that you can run that whole volume through the system in a reasonable time, and with the combination of acoustic and hydrodynamic focusing, you’re going to get excellent quality data on very few cells.

Finally, you can run cell cycle analysis on this system at a very fast rate and still get good CVs. I’ve already mentioned the automation with robotics, and that may well be of interest to a number of people. 

00:58:23 - 00:59:50, Slide 54

Let me finally acknowledge the assistance I got from Jolene Bradford at Thermo for her support, for our laboratory which I gratefully acknowledge, for Greg Kaduchak and Michael Ward from Thermo who put up with me for a few days to literally pull the Attune [NxT Flow Cytometer] to pieces, and let me see the inner workings so that I can understand how it works, and give me a better understanding of that instrument. I really appreciate that support. To Kathy Ragheb who’s a flow technician in my lab who’s running the Attune [NxT Flow Cytometer], and runs our data, and Jennie Sturgis, Valery Patsekin, and Bartek Rajwa who support my lab and gave me the assistance that I have to produce the data we publish. With that, I’m going to hand over to Judy. I thank you for your attention during this webinar.

Thank you for that presentation, Dr. Robinson. We want to get right to your questions and input, so here is a reminder as to how audience members can communicate with us. Questions can be submitted via the Q&A button at the lower-left. If we are unable to get to your question due to time constraints, Thermo Fisher Scientific will reach out to you via email.

00:59:50 - 01:00:42, Question 1

Our first question is: Why can you do cell cycle analysis faster on the Attune NxT? I thought cell cycle was always run at very slow rates. 

Well, that’s a great question. Thank you. I’ve spent a bit of time on that already, and I see that we’re at 59 minutes here. The bottom line is that because you align with both hydrodynamic and acoustic focusing, you can guarantee that those cells are perfectly in line, and when you go faster, you don’t just broaden the core by allowing the cells to move away. They can’t, because they’re held in the center by the acoustic system. That’s the real reason why cell cycle analysis runs faster. It’s something that I hadn’t thought about before. 

01:00:42 - 01:01:42, Question 2

Can you clarify whether the Attune NxT has both acoustic focusing and hydrodynamic focusing, and why are both needed? 

Well, as I said before, the original concept that Greg had was that you really didn’t need anything else. You didn’t need hydrodynamic focusing, because you can actually go with a sheathless system, but as he found out when he designed and built the instrument, the combination works better. When you build, test, and prototype systems, you find out what works better, and I suspect that was established through many prototypes. The bottom line is that we know hydrodynamic focusing works well, and if you combine the two, you get a much better system. 

01:01:42 - Question 3

Thank you. Wonderful questions coming, and we have time for just one more. Why did you say that the Attune NxT can look at samples with very low cell counts? It seems to me this is a fundamental problem with most flow cytometers.

Yes, that is true, and you really do get worried when you have very few cells in a sample. We’ve probably all been in the situation where we have very few cells but want to get a sense of the measurement without losing them, and that’s very difficult. On the Attune [NxT Flow Cytometer], because you can acoustically focus the sample, you can force every cell you have to remain in that perfect alignment, and have them come through one after the other at long intervals. In other words, you have a very dilute sample, but you can run it at a high volume rate. If you ended up with 2 µl of sample containing only 50 cells, with most flow cytometers you would be sitting there for a long time; you would just not be able to find those cells. With the Attune [NxT Flow Cytometer], you can run at a high volume rate, several hundred microliters per minute, and still be sure that those cells are going to come through in focus. That’s why you can run a high sample volume and still capture the few cells that are there.
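To put rough numbers on this point (my own arithmetic, with hypothetical volumes): acquisition time is simply sample volume divided by sample rate, so the rate difference dominates when the sample must be heavily diluted.

```python
def acquisition_minutes(volume_ul, rate_ul_per_min):
    """Time to push the whole sample volume through the cytometer."""
    return volume_ul / rate_ul_per_min

# Suppose the rescued 50 cells end up diluted into 300 µl of saline
# (a hypothetical volume, chosen only for illustration).
sample_volume = 300.0

# A conventional cytometer limited to ~12 µl/min, versus an
# acoustic-focusing instrument running at up to 1000 µl/min
# (the Attune NxT's highest sample rate).
slow = acquisition_minutes(sample_volume, 12.0)    # 25 minutes
fast = acquisition_minutes(sample_volume, 1000.0)  # 0.3 minutes, ~18 s
```

At the slow rate the rare cells are spread over nearly half an hour of acquisition; at the fast rate the whole diluted sample is consumed in under a minute while the cells stay in focus.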

01:03:33 - 01:04:08

I would like to once again thank Dr. Robinson for his presentation. Do you have any final comments? No, I’ve enjoyed the opportunity to do the presentation. If I’ve gone a little over time, I apologize. I’m sure the questions are still coming in, and I’ll be more than happy to answer those either publicly or privately. The links to my website, YouTube, and Facebook are there, so that people can find me and track me down.

01:04:08 - 01:04:36

Thank you again to Dr. Robinson. I would also like to thank LabRoots for making today’s educational webcast possible. Today’s webcast will be available on demand until November 2017. Keep an eye out for an email from LabRoots alerting you when this webcast will be available for replay. We encourage you to forward that announcement to your colleagues who may have missed today’s live event. Thank you for joining us, and we hope to see you next time. Bye-bye.

Multiparameter Cell Cycle Analysis

Find out how to get better cell recovery and definition of cell cycle compartments. This webinar will focus on multiparametric cell cycle analysis of DNA and specific epitopes using a “washless” staining assay to minimize sample handling.

0:00:00 - 0:00:39, Slide 1

Moderator: Hello, everyone. Welcome to today’s live broadcast, Multiparameter Cell Cycle Analysis, presented by Dr. James Jacobberger, professor emeritus, and director of cytometry and microscopy core, Case Western Reserve University. I am Alexis Cross of LabRoots, and I’ll be your moderator for today’s event. Today’s webinar is part of the protein and cell analysis education series, brought to you by LabRoots and sponsored by Thermo Fisher Scientific. For more information on our sponsor, please visit thermofisher.com.

0:00:39 - 0:01:13

Now, let’s get started. Before we begin, I would like to remind everyone that this event is interactive. We encourage you to participate by submitting as many questions as you want at any time you want during the presentation. To do so, simply type them into the Ask a Question box, and click on the Send button. We’ll answer as many questions as we have time for at the end of the presentation. If you have trouble seeing or hearing the presentation, click on the Support tab found at the top right of the presentation window, or report your problem by clicking on the Ask a Question box located on the far left of your screen.

0:01:13 - 0:01:32

This presentation is educational and thus offers continuing education credits. Please click on the Continuing Education Credits tab located at the top right of the presentation window and follow the process to obtain your credit. So, without further ado, Dr. Jacobberger, you may now begin your presentation.

0:01:40 - 0:03:06, Slide 2

Thank you, Alexis. Hello, everyone.
The term cell cycle analysis has come to define cytometric assays that count cells in cell cycle phases, compartments, or states; I’ll use these words interchangeably throughout the talk. This slide shows a single-parameter histogram in the upper left for cells stained for DNA content, which gives us three phases: G1, S, and G2+M. A two-parameter plot in the lower left shows cells stained for DNA content plus a mitotic marker, which then gives us four phases: G1, S, G2, and M. M is synonymous with mitosis, the mitotic stages of the cell cycle. G1 cells have one genome. S cells are synthesizing a second genome. G2 and M cells have two genomes.

On the right, a three-parameter plot shows cells stained for DNA content, a mitotic marker, and two cyclin proteins. These are proteins that oscillate periodically and regulate the cell cycle. These data create a multi-compartment model of contiguous obligate states that cells pass through during the cell cycle. That is multiparametric cell cycle analysis.

0:03:06 - 0:04:55, Slide 3

In general, DNA is the primary base marker. It provides a three-compartment subdivision ordered in time: G1, then S, then G2+M. Periodically expressed genes or activities subdivide these phases further, with the goal of creating a model with objectively defined compartments that are unambiguous, that is, where cells pass through each compartment unidirectionally, once.

This slide shows two-parameter histograms of data synthesized from such periodic expression profiles. The upper row contains expression that rises and plateaus within a single phase. The middle shows expression that rises across phases. In both rows, cells divide at the plateau level, creating daughter cells that have half that level. In the third and fourth rows, expression rises and declines within a single phase (the first, second, and fourth columns) or across phase boundaries (the third column). These are synthesized patterns based on ideas, but we and others have observed most of these patterns when looking at specific markers.

In many cases, the compartments derived from these patterns are not unambiguous. For example, mitotic markers show elevated expression and mark mitotic cells, but the pattern contains cells that are entering and exiting mitosis within the same data space, i.e. that compartment is ambiguous. To render this ambiguous compartment unambiguous, we need additional markers.

0:04:55 - 0:06:18, Slide 4

I now introduce a fixation and staining protocol that is extensively published, so the specifics are unimportant other than the lengthy time it takes to prepare cells prior to cytometry. The key elements are formaldehyde to fix the cells and stop further enzymatic activity, followed by either alcohol denaturation and permeabilization, or detergent permeabilization without denaturation. Different markers are optimized by variations on this basic protocol. Since the markers are intracellular, optimized staining requires time, 30 to 90 minutes depending on the desired data quality, with 90 minutes providing the highest signal-to-noise ratio. Further, after antibody incubation, three washes at 15 minutes each, plus centrifugation time, are optimal.

In the next few slides, I’ll present data for K562 cells, a human erythroid leukemia cell line, stained for DNA content; (phospho-S10) Histone H3, which is a mitotic marker; and two mitotic cyclins, A2 and B1. We’ll walk through the resulting data analysis step by step to create a multi-compartment cell cycle analysis.

0:06:18 - 0:07:40, Slide 5

This slide illustrates the first analysis steps, which clean up the data. These include doublet discrimination, exclusion of any anomalies from perturbed flow, and a correction for signal drift. A region is set that includes singlet cells based on the shape of the primary DNA signal pulse. The first plot shows the signal peak on the Y-axis and the signal area on the X-axis. Doublets, triplets, and so on have peak heights equal to single cells but areas equal to multiples of single cells. The middle row shows plots of DNA content versus time. For this run, there are no areas of perturbed flow, but if there were, regions would be set and used to exclude incorrectly measured data.

However, although here the effect is small, there is a continuous decrease in the signal over time. That can be corrected in the same way that compensation is applied. The quality, that is, the CV of the data, does not change over time, so the correction improves the CV of the corrected data relative to the uncorrected data. We examine all parameters for this effect and correct any parameter whose drift is severe enough. This doesn’t always happen, and the causes behind it are complex and still under study.
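These two cleanup steps can be sketched in a few lines. This is my own minimal illustration, not the authors' software; the event fields and thresholds are hypothetical. Doublets are flagged because their integrated area is roughly twice a singlet's at the same peak height, and drift is removed by fitting a straight line to signal versus acquisition time and rescaling each event back to the trend's initial value.

```python
def singlet_gate(events, max_ratio=1.5):
    """Keep events whose area/peak ratio matches single cells.
    Doublets share the singlet peak height but have ~2x the area."""
    return [e for e in events if e["area"] / e["peak"] <= max_ratio]

def correct_drift(events):
    """Least-squares fit of signal vs time, then rescale every event
    so the fitted trend becomes flat at its value at time zero."""
    n = len(events)
    xs = [e["time"] for e in events]
    ys = [e["dna"] for e in events]
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    intercept = my - slope * mx
    for e in events:
        trend = intercept + slope * e["time"]
        e["dna"] *= intercept / trend   # normalize to the t=0 trend value
    return events

# Example: the third event has the same peak but about twice the area,
# so it is treated as a doublet and dropped.
events = [
    {"peak": 100, "area": 100, "time": 0, "dna": 200.0},
    {"peak": 100, "area": 102, "time": 1, "dna": 198.0},
    {"peak": 100, "area": 205, "time": 1, "dna": 400.0},  # doublet
]
singlets = singlet_gate(events)
```

Real analyses would gate on the two-dimensional peak-versus-area cluster rather than a fixed ratio, but the principle is the same.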

0:07:40 - 0:08:30, Slide 6

Since we are going to move continuously through a multiparameter data space via bivariate windows, I’ve included this slide to illustrate the overall process. What is plotted isn’t important, because in the following slides we will go through the process step by step. What is important is the idea that we have moved through, in this case, a five-parameter data space, following the arrows in a manner that does not exclude any combination that would equal a cell cycle state, or at least that is the idea. In this example, we start at the box labeled Begin and end at the box labeled End, which are the beginning and end of the cell cycle.

0:08:30 - 0:09:43, Slide 7

The next step is to separate interphase from mitosis, shown in the panel on the left. Phospho-Histone H3 and many other heavily phosphorylated epitopes increase dramatically when cells enter mitosis. For this epitope and others, dephosphorylation occurs in late mitosis and continues through immediate early G1. Thus, we can segment all of M by gating all 4C cells with elevated phospho-Histone H3.

In these same plots of phospho-Histone H3 versus DNA, we can also capture the early G1 cells, shown by the arrow and the word “Newborn.” We can check that measurement by examining cyclin B1 versus Histone H3. Because of the decreased background of the small early G1 cells, we can separate early G1 from the rare late M cells that have partially dephosphorylated Histone H3. That’s shown in the right panel.

0:09:43 - 0:10:42, Slide 8:

The next step is to isolate G1 proper, that is, G1 minus early G1. Both G1 and G2 can be segmented using cyclin A2 versus DNA plots, with M cells removed by Boolean logic. The remaining cells are in S. The critical G1/S and S/G2 boundaries can be checked by comparing the G1, S, and G2 phase fractions of the DNA distribution, determined by Gaussian modeling of DNA content with programs such as ModFit, to the frequencies of cells determined by the color-coded gating logic. Because there is a shape to the cyclin A2 versus DNA S-phase component, it can be broken into early S (S1) and the remainder of S (S2). The plot on the far right shows the DNA distribution color-coded for G1, S1, S2, and G2.
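The cross-check described here is just a comparison of two sets of phase fractions: those counted from the gating logic and those from Gaussian modeling of the DNA histogram. A minimal sketch of the counting side, with hypothetical gate labels:

```python
def phase_fractions(labels):
    """Fraction of events in each cell cycle compartment,
    given the per-event gate labels from the color-coded logic."""
    total = len(labels)
    counts = {}
    for lab in labels:
        counts[lab] = counts.get(lab, 0) + 1
    return {lab: c / total for lab, c in counts.items()}

# Hypothetical gate labels for 10 events (illustration only).
labels = ["G1"] * 5 + ["S1"] + ["S2"] * 2 + ["G2"] * 2
fracs = phase_fractions(labels)

# These gating-derived fractions would then be compared against the
# G1/S/G2 fractions from Gaussian modeling (e.g. ModFit) of the DNA
# histogram; large disagreement suggests the boundaries need moving.
```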

0:10:42 - 0:11:56, Slide 9:

Next, we segment G1 into two states, G1A and G1B. The importance of segmenting G1 into an uncommitted G1A and a committed G1B is that it separates cells at what used to be called the restriction point, which is a commitment checkpoint. The mitotic cyclins are repressed in the uncommitted state via the activity of the anaphase-promoting complex/cyclosome, or APC/C. As it is inactivated, cyclin B1 expression increases. The level is still low and expression is continuous; therefore it is not a great marker, but at present, some information can be obtained without adding another marker. The left figure shows how we can use the G1/S boundary. S1, color-coded maroon, was defined previously, and the early G1 cells, here colored black, determine approximately where to place the region boundaries. The panel on the right shows the progression of the states G1A, G1B, S1, S2, and G2 in terms of cyclin B1 levels.

0:11:56 - 0:15:05, Slide 10:

We next turn to M phase. We define the earliest M state, P1, as 4C cells with maximum levels of cyclins and rising levels of phospho-Histone H3. That is shown in the upper left, on the plot of pHH3 versus cyclin A2. The previously defined G2 cells, in green, define the critical boundary, and the upper boundary is placed at the cluster edge. Thereafter, gating only on M cells, we plot cyclin B1 versus cyclin A2, and the pattern shown in the upper right is segmented at cluster boundaries. In this plot, P1 and P2 are coincident, and P2 is defined by Boolean logic that excludes P1.

When the APC/C begins to activate, cyclin A2 is degraded, and this is captured as PM because cells in this state correlate with prometaphase. When cells have reached undetectable levels of cyclin A2, the mitotic checkpoint is entered with stable high levels of cyclin B1 and depleted cyclin A2. This state is labeled M because it normally correlates with metaphase. If cells are treated with a mitotic spindle inhibitor, for example, nocodazole, they arrest here and this state will become highly populated.

Next, cyclin B1 is degraded, and this correlates with the onset of anaphase, but the rate of decay does not correlate well with the sequence anaphase A, anaphase B, and telophase. Since these states do not correlate well with morphologically defined stages, we label them LM, for late mitosis. LM1 is defined by cells with less than maximum but more than minimum cyclin B1, that is, cells that are degrading cyclin B1. LM2 is defined by cells in which cyclin B1 has been degraded to the minimum.

Within this state, there are additional states that can be defined by morphology. On a plot of DNA pulse peak versus the integrated DNA signal for phospho-Histone H3-positive cells, we observe two clusters in the transition. The reason is that if the cytometer is well tuned, the late anaphase and telophase cells look like doublets. This is shown in the lower right panel. The cells with the lowest peak signal, that is, those enriched in telophase, can be further subdivided into a group with maximum phospho-Histone H3 (LM2C) and those with lower levels (LM2D).

The figure on the lower left shows that cells divide from LM2C and LM2D, or at least that is the working hypothesis.

0:15:05 - 0:15:43, Slide 11:

Thus, in a 3D plot, we can visualize most of the 15 unambiguous states, 14 of which are sequentially traversed by unperturbed proliferating cells. An average cell of this population, under the proliferative conditions defined by the environment at the time of fixation, moves sequentially from G1MB to LM2C, then optionally moves to G1MB or LM2D, then to G1MB. This provides a continuous backbone onto which other markers can be mapped.

0:15:43 - 0:16:45, Slide 12:

Now I’d like to introduce the idea of adding information obtained from a different platform. The data are from a laser scanning cytometer. In this case, they are from a different, attached cell line, but in principle, the same samples of fixed and stained cells that were analyzed by the flow cytometer previously could be analyzed in this manner.

The plot at the upper left shows cells stained for phospho-Histone H3 and DNA, and I have set a gate for mitotic cells. The plot on the lower left shows nuclear size on the Y-axis and, on the X-axis, the density of cyclin B1 in each cell. Region R10 contains the largest nuclei with the minimum cyclin B1 density. Region R15 contains the smallest 4C nuclei with maximum cyclin B1 density.

0:16:45 - 0:17:15, Slide 13

If we look at the R10 images, we observe that cyclin B1 is cytoplasmic and on centrosomes, but not in the nucleus. Many of the centrosomes are well separated. At the beginning of mitosis, cyclin B1 first accumulates on the centrosomes, which then begin to separate and end up opposite each other, creating the poles of the mitotic spindle.

0:17:15 - 0:17:34, Slide 14

If we look at the images in R13, to the right of R10, we can see that cyclin B1 has entered the nucleus in approximately 50% of the cells. In the cells in which cyclin B1 is still in the cytoplasm, there are two centrosomes, and most are well separated.

0:17:34 - 0:17:54, Slide 15

If we go to R15, the cells are almost all in metaphase. Cyclin B1 is now cytoplasmic again because the nuclear membrane has broken down, but the cyclin B1 density is still high because the cells have rounded up and cyclin B1 decorates the mitotic spindle.

0:17:54 - 0:19:59, Slide 16

Now I’d like to take a step back and go over the underlying principle behind the complex bivariate data patterns we’ve been analyzing. This slide shows the expression of two parameters over time; the top row is parameter one and the middle row is parameter two. The data are events (cells) that vary on the Y-axis; that is, for every point in time, there are multiple cells representing the expression level, as if all of the cells in the population were synchronous. The two parameters are correlated in time. When a proliferating population of cells is sampled, because the cells are asynchronous, all points along the timeline of expression are represented. Thus, if we plot parameter one versus parameter two, we get two-parameter histograms, shown in the bottom row, that look like typical flow cytometry data.

I’ve color-coded the clusters and transitions in the bottom row and their corresponding segments in the two top rows. For example, the two stable minima at the beginning and end of the time sequence are ochre and brown; they are represented in the bottom bivariate plots by the first cluster in the lower left. The sharply rising expression of parameter one (cyan) is correlated with expression of parameter two at a stable minimum (also cyan). Therefore, the transitional cyan cells move to the right in the parameter-one direction and remain fixed in the parameter-two direction at the bottom of the histogram. If the expression of one parameter is shifted in time relative to the other parameter, the number of cells populating the clusters and transitions is affected (middle and right columns).
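The principle behind these patterns can be reproduced with a few lines of simulation (my own sketch, not the speaker's; the profile shapes and breakpoints are invented): sample cells uniformly along cell cycle time, look up each parameter's expression at that time, and the bivariate clusters and transitions emerge on their own.

```python
import random

def expression_1(t):
    """Parameter 1: a stable minimum, then a sharp rise, then a
    plateau (one of the synthesized patterns described in the talk)."""
    if t < 0.4:
        return 1.0
    if t < 0.6:
        return 1.0 + 9.0 * (t - 0.4) / 0.2   # rising segment
    return 10.0

def expression_2(t):
    """Parameter 2: stays at minimum until later in the cycle,
    then rises; shifted in time relative to parameter 1."""
    if t < 0.6:
        return 1.0
    return 1.0 + 9.0 * (t - 0.6) / 0.4

random.seed(0)
# An asynchronous population samples every point along the timeline,
# so uniform times stand in for an unsynchronized culture.
cells = [random.random() for _ in range(1000)]
pairs = [(expression_1(t), expression_2(t)) for t in cells]

# Plotting pairs would show: a dense cluster at (1, 1), a horizontal
# transition as parameter 1 rises alone, then a vertical rise of
# parameter 2 at maximal parameter 1 -- the patterns in the slide.
```

Shifting the breakpoint of `expression_2` relative to `expression_1` changes which clusters and transitions are populated, exactly as in the middle and right columns of the slide.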

0:19:59 - 0:21:34, Slide 17

Now, going back to our data from the laser scanning cytometer: if we calculate the median fluorescence and plot it on the Y-axis, and plot relative cell cycle time, calculated from the compartment frequencies, on the X-axis, we can see the profiles underlying the parameters; here, phospho-Histone H3 expression and cyclin B1 density, or any other marker that we have included. If we have imaging data, then we can do the same thing for other information by showing the time-related cumulative change in frequencies.

For example, this plot shows the parameter expression profiles and reveals the variation in entry into, or passage through, the mitotic stages. It can be seen that the accumulation of two centrosomes parallels prophase, and both correlate with the rise in phospho-Histone H3. Equally, cyclin B1 enters the nucleus rapidly in all of the cells over a very short period, and this is followed immediately by entry into prometaphase. The expression of cyclin B1 through this period is maximal and stable, but the density rises sharply at the end of prophase. Metaphase entry is more variable, but by the time most of the cells have entered, cyclin B1 density remains high. By the time anaphase begins, cyclin B1 density is falling. Thus, we can quantitatively correlate expression, movement of molecules, and morphological changes.
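The "relative cell cycle time calculated from the compartment frequencies" works because, in a steady-state asynchronous population, the fraction of cells in a compartment approximates the fraction of cycle time spent there (ignoring the age-distribution correction for exponentially growing cultures). A midpoint along the cumulative frequency then serves as each compartment's time coordinate. A sketch with hypothetical frequencies:

```python
def relative_times(ordered_fracs):
    """Map ordered compartment frequencies to relative cycle times:
    each compartment's coordinate is the midpoint of its span of the
    cumulative frequency, running from 0 to 1 over the whole cycle."""
    times, cumulative = {}, 0.0
    for name, frac in ordered_fracs:
        times[name] = cumulative + frac / 2.0
        cumulative += frac
    return times

# Hypothetical frequencies for a simple four-compartment model,
# listed in the order the compartments are traversed.
fracs = [("G1", 0.5), ("S", 0.3), ("G2", 0.15), ("M", 0.05)]
t = relative_times(fracs)
# e.g. t["G1"] falls at the midpoint of the first half of the cycle.
```

Median fluorescence per compartment plotted against these time coordinates recovers the expression profile underlying the bivariate patterns.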

0:21:34 - 0:24:38, Slide 18

This is about where we and others have taken multiparametric cell cycle analysis. I’d like to make a few comments, and then I’ll present some of our recent work. Since I can imagine critics saying, “This is all nice, but what real value is this complicated analysis?”, I’ll start on the right side of this slide first. The first practical application is in pharmacodynamics. Cell cycle regulators and other targets that affect the cell cycle are still rational approaches in cancer chemotherapy. This type of assay might provide improved information in pharmacodynamic studies and, subsequently, serve as a possible therapeutic guide. We have built an analysis of DNMT1, a target of azacitidine and decitabine therapies, using a reduced version of this approach.

Second, for any blood or bone marrow analysis, some subset of this approach could reduce the complexity of the data by normalizing expression of other molecules across the cell cycle, possibly reducing the spread of the data by a factor of two and coincidentally providing high-quality doublet discrimination. Finally, my good friend David Hedley believes that it is time to use this information to study the effects of drugs in therapeutic studies using xenografts. I believe he’s right.

To the left side: these are the things that I would do going forward. G1 can be compartmentalized just as we have done for mitosis. There are many candidate markers; the data will be a little messier because G1 is more highly variable, but I expect it to break apart in a similar fashion. G2 has a major DNA damage checkpoint operating; this has been extensively studied, and the probes are available to see if G2 could be as interesting. I would like to do studies that incorporate data from multiple platforms into a single correlated dataset using the principles I’ve outlined. Imaging is my favorite, but slit scan is underexplored, and the work of [George DuCane] and [Sergei Gulnik] demonstrates that differential permeabilization may be a fruitful way to get at molecular sequestration.

I think the day has come for a new generation of probes, and camelid antibodies are my favorite in that regard. Multiparametric data should be analyzed by probability state modeling; see Bruce Bagwell’s work on this. This would eliminate, or significantly reduce, the number of decisions about where to set regions. Finally, I think improved hardware is not out of the question.

0:24:38 - 0:26:09, Slide 19

Speaking of improved hardware, I’d like to present some results that I’m very pleased with.

Here’s a reference to a paper by Goddard et al. on acoustic focusing cytometry. This is a technology where acoustic energy is directed into the flow cell in such a way that the center of the stream is a volume where the waves cancel and there is an energy absence. Thus, the cells move to the center, and once in the center, they stay there. This is illustrated by the diagram on the left and the photograph on the right.

Thermo Fisher has commercialized this technology in the Attune NxT Flow Cytometer. What this meant to us is that we could try to develop a semi-washless protocol and reduce the time we use to stain cells. The idea is to wash away the fixative normally, incubate in antibodies in a small volume, and then add one large dilution. The Attune instrument can run a large volume very rapidly, up to 1 mL per minute. The instrument can do this because the cells all move to the center and remain in focus. Thus, the unwashed, diluted large-volume sample might be able to go directly onto the instrument after the incubation period.

0:26:09 - 0:26:50, Slide 20

Here is the protocol. Without going through the entire thing, it is just like our standard protocol except we save time on the washes and on acquisition, a time savings of two hours: 5½ hours compared to 3½ hours. Additionally, in these assays, each centrifugation results in a loss of cells, to the extent that starting with a million cells usually provides significantly fewer than 100,000 analyzable events. We have not quantified it, but this washless assay does provide a noticeably better cell recovery. We’ll now look at some comparative data.

0:26:50 - 0:27:15, Slide 21

Here are the data comparing the two assays. They are visually equivalent. The top row compares phospho-Histone H3 versus cyclin A2. The middle row compares phospho-Histone H3 versus cyclin B1. The bottom row compares cyclin B1 versus cyclin A2 for mitotic cells.

0:27:15 - 0:27:25, Slide 22

We quantified five mitotic states as these are the rarest events. The regions are shown in the bottom row.

0:27:25 - 0:27:50, Slide 23

Here is the result. There’s not a significant difference between the percentages of cells within each compartment or the levels of cyclin A2 and B1 in the cells. Therefore, this is a significant step forward for this kind of work. Since I have made the comments that I would ordinarily say for the summary, I’ll stop here.

0:27:50 - 0:28:15, Slide 24

One more thing, the work shown in this talk depended on the work of many people over many years from my own laboratory and those of many others. The references listed on this slide are reviews of the work or the extensive references to our work and that of others. Thank you for listening.

0:28:15 - 0:28:50, Slide 25

The people who did the work at Case Western Reserve University are Tammy Stefan and Phil Woost on the experimental side, Mike Sramkoski and Allison Kipling on the instrument side. From Thermo Fisher, Suzanne Schloemann ran our initial experiments, and Brian Wortham provided access to the instrument. Jolene Bradford and Mike Ward were critical for moving this project forward, and we continue to work with them on other projects.

0:28:50 - 0:29:17, Slide 26

Moderator: Thank you, Dr. Jacobberger, for your informative presentation. We will now start the live Q&A portion of the webinar. If you have a question you would like to ask, please do so now. Just type them into the Ask a Question box, and click on the Send button. We’ll answer as many questions as we have time for at the end of the presentation. Questions we do not have time for today will be answered by Dr. Jacobberger via email following the presentation.

0:29:17 - 0:30:35, Question 1

Our first question is: presumably information increases with the addition of parameters. Is the relationship simple? If so, what is the relationship between information and parameter number?

I’ve looked at this in some detail, and the short answer is that there’s no simple relationship. I have the feeling that, early on, we hit on parameters that were more informative per parameter than I expect to get in the future. For sure, an additional parameter should add at least one piece of information, or it’s redundant. It’s not unthinkable that two or three pieces of information might be added per parameter. That’s about all I know about that.

0:30:35 - 0:31:15, Question 2

Our next question is: you have presented the analytical scheme of the backbone with the idea that it is definitely extensible. Have you tested this idea?

The short answer is no, I haven’t. There’s no reason that I can possibly think of that would prevent that from being true. So theoretically, it’s doable, but a real test of it still needs to be done.

0:31:15 - 0:33:55, Question 3

It looks like we have time for one more question. Can you give some examples of when a no-wash or minimal wash protocol will be effective?

Yes, as far as I’m concerned, it’s effective almost always because it saves time. But perhaps the more important case would be things like clinical samples that are very limited in cell numbers. There, the idea of not losing cells through centrifugation comes to the forefront, and the idea of incubating with antibodies, just diluting up and running, and being able to run almost the entire sample is very appealing, especially for clinical samples.

We did a study here, published recently, on sickle-cell disease and the treatment of patients with low-dose decitabine to reactivate fetal hemoglobin. What we were looking for was the decrease in DNMT1 as a function of the decitabine dose. All of the samples had been collected prior to us developing the assay, and we didn’t have any input into how the samples were prepared. Those samples were – I don’t remember the exact cell numbers, but it was very few cells, something like a couple of hundred thousand in a relatively large volume of fixative. So, precious samples where we really didn’t want to lose any cells. In that case, the no-wash approach is almost a lifesaver. It’s not that it can’t be done with washing, but I’m certainly much happier with the no-wash solution.

0:33:55 - 0:34:27, Question 4

Thank you again, Dr. Jacobberger. Do you have any final comments for our audience?

No, I hope you enjoyed the seminar – or the webinar. My email is, I think, in the credits or whatever, and I’m happy to answer any questions, if somebody has them, after the fact through email.

0:34:27 - 0:34:40

Moderator: I would like to once again thank Dr. Jacobberger for his presentation. I would also like to thank LabRoots and Thermo Fisher Scientific for making today’s education webcast possible. Questions we did not have time for today will be answered by Dr. Jacobberger via email following this presentation.

0:34:40 - 0:35:03

Before we go, I’d like to remind everyone that today’s webcast will be available for on-demand viewing through May 30th of 2018. You will receive an email from LabRoots alerting you when this webcast is available for replay. We encourage you to share that email with your colleagues who may have missed today’s live event. That’s all for now. Thank you for joining us, and we hope to see you again soon. Goodbye!

Making polychromatic flow cytometry easy after instrument characterization and verification

Join Grace Chojnowski, Flow Cytometry and Imaging Facility Manager at QIMR Berghofer Medical Research Institute as she discusses how instrument standardization can help simplify your flow cytometry.

0:00:00 - 0:00:14, Slide 1

Hello. We’re glad you’ve joined us in this live webinar, “Making Polychromatic Flow Cytometry Easy After Instrument Characterization and Validation”. I am Judy O’Rourke of LabRoots and I’ll be moderating this session.

0:00:14 - 0:00:57

Today’s educational web seminar is presented by LabRoots, the leading scientific social networking website and provider of virtual events and webinars advancing scientific collaboration and learning. It’s brought to you by Thermo Fisher Scientific. Thermo Fisher is a world leader in serving science whose mission is to enable customers to make the world healthier, cleaner, and safer. Thermo Fisher Scientific helps their customers accelerate life sciences research, solve complex analytical challenges, improve patient diagnostics and increase laboratory productivity. For more information, please visit www.thermoscientific.com.

0:00:57 - 0:01:51

Let’s get started. You can post questions to the speaker during the presentation while they’re fresh in your mind. To do so, simply type into the dropdown box located on the far left of your screen labeled “Ask a question” and click on the Send button. Questions will be answered after the presentation. To enlarge the slide window, click on the arrows at the top right-hand corner of the presentation window. If you experience technical problems seeing or hearing the presentation, just click on the support tab found at the top right of the presentation window, or report your problem by typing it into the “Ask a question” box located on the far left of your screen. This is an educational webinar and thus offers free continuing education credits. Please click on the “Continuing education credits” tab located at the top right of the presentation window and follow the process for obtaining your credits.

0:01:57 - 0:02:51

I now present today’s speaker, Grace Chojnowski, the Flow Cytometry and Imaging Facility Manager at the QIMR Berghofer Medical Research Institute in Brisbane, Australia. Grace has a strong instrumentation background and has been managing flow cytometry core facilities for more than 30 years. She came to QIMR Berghofer in 1993, having earlier worked as the flow cytometry facility manager at the Peter MacCallum Cancer Institute in Melbourne. She graduated from RMIT University with a Master’s in Applied Science. Grace is active with the Australasian Cytometry Society, organizing courses and meetings, and serves on the executive committee. She has served two terms as a councilor for the International Society for Advancement of Cytometry and continues to be active with the society. Grace’s complete bio is found on the LabRoots website. Grace Chojnowski will now begin her presentation.

0:02:51 - 0:03:15

Thank you very much, Judy, and thank you for your kind words and the opportunity to share the work we’ve done here at QIMR Berghofer on instrument characterization and validation.

0:03:15 - 0:04:13, Slide 2

So, my role at QIMR Berghofer is to try to help facilitate good cytometry for all our internal academic users and our external academic users, and also to help some of our commercial clients and contract work, as well as some of the clinical trial work we do. We have users who have a reasonably deep understanding of flow cytometry, but we also have those who become a little bit overwhelmed by the complexity of some of the instrumentation. We have users who prepare a panel on an analyzer, then want to do some sorting using the same panel, and cannot understand why their populations may look slightly different and why one instrument gives them better separation than others.

So, we embarked on this series of experiments to see if we could answer some of those questions.

0:04:13 - 0:04:41, Slide 4

So, what did we do? Beads are the standard used for flow cytometry performance checking. We know that these beads use different dyes to the ones that are commonly used when performing flow cytometry applications and assays. As such, we wanted to test our instruments’ response to the same fluorochromes that we use for our day-to-day assays, and not just the dyes that are attached to the beads.

0:04:41 - 0:06:00

Another point is that the brightness or staining indexes provided by vendors give us an indication of how bright some of these fluorochromes may be, but that is not always the same on our instruments, on my instrument or your instrument. We tested 61 CD4 antibodies with 46 fluorochromes and a number of different clones. It would have been ideal to have had the same clone for all the different fluorochromes, but we weren’t able to obtain all 46 in the same clone. Initially, we started using normal healthy volunteer peripheral blood, but as you can imagine, we needed a lot of blood from one volunteer so we could avoid any variability from multiple donors in one experiment. After a while, we decided to start using comp beads as well. The reason is that comp beads don’t change from day to day or individual to individual. The concentration is always constant, and it’s a lot easier to use comp beads than large volumes of blood from different volunteers.

0:06:00 - 0:06:58, Slide 5

So, polychromatic flow cytometry has grown rapidly over the last five to 10 years. Many years ago, we had the organic dyes like FITC and some of the phycobiliproteins like PE and APC; then along came some of the tandem dyes and Qdots, and more recently the new Sirigen dyes. This increased availability of new dyes, along with new instrumentation that’s able to acquire many more parameters simultaneously, allows us to measure so many more antigens simultaneously on one single cell. Having a good handle on instrument characteristics and the optical configuration can have a big impact on the ability to resolve many different cell populations.

0:06:58 - 0:07:36, Slide 6

Many vendors that sell reagents with the different fluorophores will give us an indication of their level of brightness, and some vendors will rank these dyes, as can be seen in this table. But this ranking may be different on your instrument or my instrument, either because my optical configuration is different, or because my collection optics, laser excitation power, or wavelengths may be different as well.

0:07:36 - 0:07:50, Slide 7

Here we have some fluorochromes that are ranked in order of brightness but also grouped in either high brightness, medium or low.

0:07:50 - 0:08:05, Slide 8

Here, again, we’ve got the rank from very high to very low, or numerically from a five (high) to a one (low).

0:08:05 - 0:08:57, Slide 9

Here’s the list of the fluorochromes that we used in our instrument characterization, as well as the CD4 clone that was associated with each fluorochrome. There were three main vendors that we obtained our antibodies from, and I’ll name them A, B, and C. The CD4 clone SK3 was the most common one used, followed by RPA-T4, and then a couple of the S3.5 clones. You can also see that some of the antibodies, with the same clone and the same fluorochrome, were available from a number of different vendors. For example, APC we could get from a number of the vendors.

0:08:57 - 0:10:06, Slide 10

This table shows us the optical configuration of the instruments at our institute which we used for these experiments and on which we obtained our individual instrument characteristics. You can see that some of the instruments have identical optical configurations: the Fortessa 1 and 2 are identical, and the Fortessa 4 and 5 are also identical, so they have the same lasers, the same dichroics, and the same collection filters, whereas the other ones may have slight variations in some of the bandpass filters and dichroics. I haven’t listed the dichroic filters or the steering optics here, but they may also differ slightly from instrument to instrument.

0:10:06 - 0:11:13, Slide 11

Okay. So, antibody titer: all antibodies were titrated on normal healthy volunteer peripheral blood, but we also compared titers using the comp beads. We expressed the titer as protein in micrograms per mL. We wanted to make sure that the same amount of antibody was present for all the antibodies that we tested, and that the only difference was the fluorochrome we were using. A titer on peripheral blood is a lot easier to obtain, but with comp beads, because of all the binding sites on the beads, we really wanted to make sure that the amount of protein was the same across all the fluorochromes.

0:11:13 - 0:11:42, Slide 12

The staining index. What is the staining index and what does the staining index tell us? It lets us know how well-resolved a population is from the background or from the negative population. The brighter the signal above the background or the unstained in the sample, the higher the staining index.

0:11:42 - 0:13:01, Slide 13

For all the data acquired for each antibody at each titer, we calculated the staining index. You calculate the mean fluorescence intensity, or MFI, of the positive cell population and the MFI of the negative cell population, then calculate the standard deviation of the negative population, and then apply a formula where you subtract the negative MFI from the positive MFI and divide by two times the standard deviation of the negative population. As the negative population spreads, the standard deviation will also increase, which will then have an effect on the staining index. The staining index allows us to determine how well resolved our CD4 population is from the background, or the negative population.
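As an editorial aside, the formula just described is easy to sketch in code; the MFI and standard deviation values below are hypothetical, not taken from the talk:

```python
# Staining index: how well the positive population separates from the negative.
# SI = (MFI_pos - MFI_neg) / (2 * SD_neg)

def staining_index(mfi_pos: float, mfi_neg: float, sd_neg: float) -> float:
    """Compute the staining index from population statistics."""
    return (mfi_pos - mfi_neg) / (2.0 * sd_neg)

# Hypothetical example values for a CD4 stain:
si = staining_index(mfi_pos=12500.0, mfi_neg=150.0, sd_neg=85.0)
print(round(si, 1))  # (12500 - 150) / (2 * 85) ≈ 72.6
```

Note that a wider negative population (larger SD) lowers the index even when the positive MFI is unchanged, which is exactly the behavior described above.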

0:13:01 - 0:14:17, Slide 14

Here is an example of some of the staining index results we got, and what we considered the optimal titer. We did this for all the fluorochromes. I’ve also plotted the positive mean fluorescence and the negative mean fluorescence, and in red we can see the staining index. You can also see that the highest mean fluorescence intensity on a positive population doesn’t necessarily mean that that higher titer, or that concentration, is going to give you the best staining index: sometimes you will get an increase in the negative, which will have an effect on the final staining index number. It’s about getting the best separation between your negative cells and your positive cells.

0:14:17 - 0:15:13, Slide 15


0:15:13 - 0:15:46, Slide 16

We also did some voltage titration experiments, where we altered the voltage for each detector to see what was the best voltage for the different fluorochromes, and looked at the range where you get better separation between your negative and your positive cells.
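The voltage titration idea can be sketched as: compute the staining index at each candidate voltage and keep the voltage with the best separation. The voltages and population statistics below are made up purely for illustration:

```python
# Pick the detector voltage that maximizes the staining index.
# Each entry: voltage -> (MFI_pos, MFI_neg, SD_neg); all values hypothetical.
titration = {
    450: (3200.0, 90.0, 60.0),
    500: (7800.0, 140.0, 75.0),
    550: (15000.0, 320.0, 160.0),
    600: (26000.0, 900.0, 420.0),
}

def staining_index(mfi_pos: float, mfi_neg: float, sd_neg: float) -> float:
    return (mfi_pos - mfi_neg) / (2.0 * sd_neg)

# Higher voltage raises the positive MFI, but past a point the negative
# population's spread grows faster, so the staining index falls again.
best_voltage = max(titration, key=lambda v: staining_index(*titration[v]))
print(best_voltage)  # 500 in this made-up series
```

The point of the sketch is the shape of the search, not the numbers: the raw positive signal keeps climbing with voltage, while the staining index peaks and then declines.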

0:15:46 - 0:17:00, Slide 17

The results. Here we’ve got the response from one of our instruments, one of our Fortessas. It has five lasers, and we were able to configure the instrument to acquire data from all the fluorochromes we tested. We can see here that the two fluorochromes that give us the best staining index are from two different vendors. They’re excited by the same laser, collected by the same detector, and have similar spectral properties. What this is indicating is that the detector picking up those fluorochromes is very sensitive to that part of the spectrum, and also that those two fluorochromes are reasonably bright.

0:17:00 - 0:18:20, Slide 18

Here we’ve got four instruments. The two top instruments are identical in their optical configuration, and the bottom two are also identical in their optical configuration. The two on the left, the SH-04 and the F4A, are older instruments, and the two on the right were purchased more recently. The two on the left were very popular; they’re usually quite booked out, and users who dislike reconfiguring their experiments said, “Well, we need instruments that are exactly the same.” So, we purchased additional instruments with the same configuration to keep our users happy. We can see that the older instruments don’t have as good a response as the new ones. Later in this presentation, I will try to address why this is, or why I think this is.

0:18:20 - 0:18:38, Slide 19

Here we have one of our cell sorters, and you can see that the response to most of the fluorochromes is very good, but the response to the fluorochromes excited by the 561 nm laser is especially good.

0:18:38 - 0:19:10

Just before we performed these instrument characterization experiments, we had a few issues with the instrument, and the 561 nm laser died. So, we got a brand new laser; all the fibers had been replaced and everything had been realigned, which could be another reason why the yellow-green region is giving us much superior signals to the rest of the fluorochromes.

0:19:10 - 0:20:00, Slide 20

This is the stream-in-air cell sorter; it’s our MoFlo, and again we’ve got an excellent response to the yellow-green fluorochromes. One thing that we did for this instrument, when the new fluorescent protein dyes became popular, was purchase a reasonably high-powered laser to excite some of those fluorescent proteins, such as mCherry. So, the power from this laser is a lot higher than the power of the other lasers on the instrument, which can also explain why we’re getting such a good response.

0:20:00 - 0:20:55, Slide 21

So, do the same fluorochromes have a better response on our instruments from one vendor than another? No, not really. They’re all pretty much the same. We compared the same CD4 clone with the same fluorochrome from different vendors and found that they gave similar results on all our different instruments.

We’ve got the APC, FITC, PE-Cy7, and PE responses on all the instruments.

Where you’ve got a good response for one vendor’s fluorochrome, you also got a reasonably good response from the other vendors.

0:20:55 - 0:21:59, Slide 22

Comparisons between different fluorochromes using the same detector: when you compare different fluorochromes from different vendors, there are differences. These could be due to the actual brightness of the dye, but they could also be due to the specific collection optics of that instrument. Some filters may be better suited to a particular fluorochrome than another, and we can’t constantly be changing our collection filters to accommodate one fluorochrome or another from one experiment to the next. You can see that some of the emission spectra are reasonably close, but they’re not all identical.

0:21:59 - 0:24:21, Slide 23

In an ideal flow cytometry world, our dyes would emit in a narrow spectral band with no, or very little, spillover signal into different detectors. In reality, fluorochromes emit into a whole lot of other detectors where you don’t want them: into adjacent detectors, but also into detectors on different excitation wavelengths. Sometimes the spillover can be quite significant, and this makes multicolor, or polychromatic, cytometry and panel design difficult. When there’s spillover of fluorescence into multiple detectors, it has the effect of spreading the nominal negative cell distribution, and this can have a huge impact on the ability to resolve some populations, especially those with very low antigen expression. So, after we obtained all these instrument characteristics, we used the feature in FlowJo that is able to calculate the spillover spreading matrix between the different fluorochrome combinations. A table like this makes it easy to recognize which fluorochrome combinations should best be avoided, as they have a very high level of spillover into each other. If you cannot avoid using some of those combinations, you should try to assign the dim markers to the bright dyes that have minimal spillover. In the table, we can see some examples, such as PE-Cy5 and BB700, where the value is quite high. If you had two dim populations on either of those fluorochromes, you may not be able to resolve those populations very well.

0:24:21 - 0:26:22, Slide 24

So, here we have another table where we’re able to summarize all the spillover. You can see which fluorochromes are going to give you quite high spillover into the other detectors. You can actually calculate the total spread that each dye contributes. It lets you know whether that dye is going to work well, or play well, with some of the other dyes, which is very useful when you’re designing a new panel and also when you’re adding one or two extra fluorochromes or markers to an already prepared panel.

Down the bottom here, you can also calculate the total spread received by each detector. Detectors that have very low values will be the most sensitive when all the other colors are in use. This is very important when you’re trying to resolve very dim populations. Again, for those dim markers, try to put them on the brightest dye with the least spread, to give yourself the best chance of success.
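The row and column bookkeeping just described can be sketched as follows; the dye names and spillover spreading values are invented for illustration, not measured on any instrument:

```python
# A sketch of the "total spread" summaries from a spillover spreading matrix (SSM).
dyes = ["FITC", "PE", "PE-Cy5", "APC"]
ssm = [
    # FITC  PE    PE-Cy5  APC   <- detector receiving the spread
    [0.0,  1.2,  0.4,    0.1],  # spread contributed by FITC
    [1.0,  0.0,  2.5,    0.3],  # spread contributed by PE
    [0.2,  1.8,  0.0,    3.1],  # spread contributed by PE-Cy5
    [0.1,  0.2,  2.0,    0.0],  # spread contributed by APC
]

# Row sums: total spread each dye pushes into the other detectors.
# A high value means the dye "plays poorly" with others in a panel.
contributed = {dye: sum(row) for dye, row in zip(dyes, ssm)}

# Column sums: total spread each detector receives from all the other dyes.
# The detector with the lowest value is the best home for a dim marker.
received = {dyes[j]: sum(row[j] for row in ssm) for j in range(len(dyes))}

best_for_dim = min(received, key=received.get)
print(contributed)
print(best_for_dim)
```

With these made-up numbers, the PE-Cy5 row sums highest (it spreads the most into other detectors), while the FITC column sums lowest, so FITC’s detector would be the safest place for a dim marker.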

0:26:22 - 0:27:51, Slide 25

I’m going to try to explain what I think some of the reasons are for the differences in staining index. One of the things that may have an effect on the staining index, and on how well we’re able to resolve our cells of interest, is laser power. On one of our analyzers we were able to change the laser power, and we ran some eight-peak rainbow beads just to see the effect of laser power on the ability to resolve all eight peaks. As we increased the laser power, we were able to resolve some of the dimmer bead populations. We used two different collection filters for these beads, and you can see that no matter how high you turn up the laser, some of the dim populations were never resolved in the higher-wavelength, 670/30, collection. That’s because the dye emission doesn’t peak around that area; it peaks at the lower wavelengths.

0:27:51 - 0:28:40, Slide 26

Here, we also looked at another analyzer where we were again able to change the laser power, and the effect it had on resolving the eight-peak beads. The laser power is important, maybe not so much for some of the bright-expressing beads or dyes, but where you do have very low antigen densities or very dim populations, laser power can help improve the resolution.

0:28:40 - 0:29:02, Slide 27

Again, this is another representation of how the laser power has an effect on resolving the low bead populations.

0:29:02 - 0:31:14, Slide 28

Optical filters, band pass filters, dichroic filters: they’re made up of different coating layers that allow you to collect the different wavelengths of interest. Earlier in the presentation I showed a couple of instruments which in theory should have been identical, having the same optical layout, yet we saw that the newer instruments gave us a better staining index than the older ones. One reason for this is that the band pass filters in the older instruments have degraded. Environmental deterioration, due to moisture getting into some of the layers of the band pass filters, will have a big impact on how well these filters behave. Brisbane is in the sub-tropics; our summers are hot and very, very humid. No matter how hard we try to control the humidity and keep the temperature at a constant level, there is still a lot of moisture in the air. I’m pretty sure that a lot of our band pass filters degrade just because there is a lot of moisture in the air. Something you can try: if you don’t think your filters are performing well, take them out of your instrument and hold them up against some light; you can actually start to see the degradation, and when you do start to see that, it’s time to swap the filters. There are different vendors that supply filters, and as the quality of the filter goes up, so does the price.

0:31:14 - 0:32:56, Slide 29

In this slide, we can see the effect that filters can have on how well we’re able to resolve populations. Here, we’ve run some eight-peak rainbow beads and swapped only the band pass filter collecting the signal. Everything else was the same, the dichroics and the laser power, and we used the same nominal band pass filter, collecting at 530/30. Some of the new filters may not be as good as one would think, and some of the older ones, you can see, are so bad that you can’t even resolve some of the higher peaks well. Band pass filters play a big part in how bright a signal you’re getting on your instrument. So, if your instruments are quite a few years old, it’s really worth sitting down and checking: have your filters degraded? Do they need to be replaced? If so, when you do replace them, you’ll find that there is a marked difference in how well your instrument performs.

0:32:56 - 0:35:25, Slide 30

Photodetectors, or photomultiplier tubes, convert the photons of light emitted from our fluorochromes into an electronic signal that we can then measure and display in flow cytometry. As the number of photons hitting the detector increases, the resulting current also increases. Photomultiplier tubes have different responses to different wavelengths. The manufacturers will select photomultiplier tubes for a range of wavelengths to be detected within the visible range. Some photomultiplier tubes have much better responses to a particular wavelength than others, and sometimes a tube may not be placed in the most ideal spot for its most suitable wavelength. Years ago, with the instruments I had, I used to remove all the photomultiplier tubes from an instrument and then test each one individually for its response to different wavelengths. I would then place them on my machine in the positions best suited to collect each particular wavelength. The newer instruments these days make that a little more difficult; everything is bolted down and locked into the instrument. Some of the cell sorters still allow you to do that, but with most of the analyzers it’s not so easy to do anymore.

Something else that people should be aware of: if you have very, very bright dyes, where you’ve got a very high signal going to a detector, at that high detection rate some photomultiplier tubes may no longer respond in a linear fashion, and they do saturate when the incident light is very, very high.

0:35:25 - 0:36:24, Slide 31

The Q: what is your Q? The Q value gives you an indication of how efficiently the detector is able to convert that light signal, or fluorescent signal, into an electronic signal, or photoelectrons. If you’ve got a cell whose 10 emitted fluorescence photons hit the detector and the conversion is five photoelectrons, then the Q value would be 0.5, a 50% conversion. The average conversion in most flow cytometers is around maybe 20-30%, which is very good. Not every single fluorescence photon gets converted into a photoelectron.
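That conversion is just a ratio, sketched here with the numbers from the example above:

```python
# Q: detection efficiency, photoelectrons generated per photon arriving.
def q_value(photons_in: int, photoelectrons_out: int) -> float:
    """Fraction of incident photons converted to photoelectrons."""
    return photoelectrons_out / photons_in

# Example from the talk: 10 fluorescence photons hit the detector, 5 convert.
print(q_value(10, 5))  # 0.5, i.e. 50% conversion
# A more typical flow cytometer sits around 0.2-0.3 (20-30%).
```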

0:36:24 - 0:37:51, Slide 32

What kind of things affect the Q value? As I showed earlier, laser power, but also laser alignment. You may have a laser with really good power, but if it's not aligned properly and it's not focused properly, you're not going to get the most optimal signal. You really need to make sure that you look after your instrument. You've got to have a clean flow cell or nozzle. Make sure all your steering optics and objectives are clean, and the lenses are clean. If you've got neutral density filters in front of some of your detectors, that will also decrease the Q value. As I said before, PMT sensitivity: some detectors give a better response to some wavelengths and are less sensitive to others, especially some of the red wavelengths, and it also matters how well that PMT responds to different wavelengths.

0:37:51 - 0:39:18, Slide 33

Ranking dye brightness and fluorochrome response on individual flow cytometers: they are different. These PMTs have different responses to different wavelengths. Your cytometer model and its optical configuration matter. How much noise you actually have in your detector will play a part as well. Do you have good clean power, or is your lab reasonably noisy? All these things play a part. We know that some dyes are brighter than others, but because of the variation in instruments and in the collection optics, on some instruments you will get a different response. Also, something that is important is how well you look after your instrument and how clean the optics are. If things do start to degrade, it really is a good idea to replace some of your band pass filters.

0:39:18 - 0:40:38, Slide 34

What can we do to improve how we select different markers associated with fluorochromes when designing our panels? Ranking fluorochromes for instruments, which is what we've done, is a good start. What we've done here on one of our Fortessas is group the different fluorochromes into very high, high, moderate, and low to very low. The actual groupings of high or low fluorochrome brightness are going to be different for each instrument. A fluorochrome that is very high on one instrument may not be the same on another instrument. So people need to be aware of that if they're going to be staining a certain panel, designed this way, on one of the other instruments. They may not get as good a result.

0:40:38 - 0:42:40, Slide 35

Things to consider when you are designing multi-color panels: you really need to determine, define, how good your dye brightness is on an instrument. Try and avoid fluorochromes that spill over into each other. The less overlap there is, the better your resolution. Another suggestion: if some of the dyes do overlap, then try and use markers for those overlapping fluorochromes that aren't co-expressed, something like T and B cell markers. Try and choose bright dyes for some of the low-expression markers, especially fluorochromes that don't have a lot of spillover. Understand the biology of your antigen and what the question is you're asking with your experiment. Define the dye brightness for each analyzer, and then also consider the spreading error of that dye on the instrument that you're going to be using.

Something else you need to take into consideration is whether some of the markers you're looking at are transient. Does something biological have to happen for that marker to be expressed? In those cases, really try and use fluorochromes that are bright or have very minimal spillover. Know your flow cytometer.

0:42:40 - 0:44:14, Slide 36

What do we need to do? Well, we continually update and educate our facility users about dye complexity and panel design. We emphasize the individual optical characteristics within the facility, and we'll sit down with our users and help tailor their panel to the instrument of their choice. We've ranked all our dyes and use some of the features in software that will allow you to calculate the spread of your fluorochromes for that particular cytometer, and provide this as a really good resource for facility users to help guide them in preparing the panel design, including some of the more complex panels that we've been doing for the clinical trials we run here. We aim to give the best advice and service, because we are a service facility for our users, and we will modify our instruments as new dyes come on the market to help further improve the service to our users.

0:44:14 - 0:44:52, Slide 37

I just wanted to thank the rest of the gang here. It was a pretty major effort over multiple days when we did this work. Every single person had an instrument, and they basically spent almost the whole day on each instrument running all the different titers and all the different fluorochromes just to get all this data together. My thanks go to everyone. Any questions?

0:44:52 - 0:45:14 

Thank you, Grace for your presentation. A quick reminder for our audience on how to submit questions. Simply type them into the dropdown box located on the far left of your presentation window labeled “Ask a Question” and click on the “Send” button. Grace Chojnowski will answer as many questions as time permits.

0:45:14 - 0:46:10

First question is: How important is titration of each antibody conjugate used?

It's very important. As we saw with the staining index, having the highest titer doesn't always give you the best separation of the cells of interest. You may actually start to increase the background a little bit more if you have too high a concentration. Then again, if the concentration of your antibody is too low, you're not going to get a very good signal either. So it is crucial that you do titer your antibody before you start doing any experiment.
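The staining index mentioned in this answer is commonly defined as the separation between the positive and negative populations divided by twice the spread of the negative population. A minimal sketch of using it to pick a titer (the formula is the widely used convention, not something stated in the talk, and all the numbers below are hypothetical):

```python
def staining_index(mfi_pos: float, mfi_neg: float, sd_neg: float) -> float:
    """Common staining index: separation of the positive population
    from the negative, scaled by the spread of the negative."""
    return (mfi_pos - mfi_neg) / (2 * sd_neg)

# Hypothetical titration series: (dilution, MFI+, MFI-, SD of negatives).
titration = [
    ("1:50",  5200, 420, 180),  # too much antibody: background rises
    ("1:100", 5000, 250, 120),
    ("1:200", 4600, 200, 100),  # best separation
    ("1:400", 2100, 190, 100),  # too little antibody: signal drops
]

# Pick the dilution with the highest staining index.
best = max(titration, key=lambda t: staining_index(*t[1:]))
print(best[0])  # 1:200
```

The point the speaker makes falls out of the numbers: the most concentrated dilution is not the winner, because its higher background widens and lifts the negative population.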

0:46:10 - 0:46:26

Thank you. You mentioned doing an antibody titer using comp beads. How does this compare to titration on the cells and what are the advantages and disadvantages of doing titration using comp beads?

0:46:26 - 0:48:00

The advantage is that you don't need to collect large volumes of blood for all the fluorochromes. You can imagine 61 different antibodies with numerous titers; we wanted to make sure we had the same sample throughout. You do need to collect a lot of blood, which some volunteers are happy to do, but only for a while. With comp beads, if you get the same lot number, the concentration is the same. The only disadvantage with using comp beads is that they have a lot more binding sites on the surface, so the amount of antibody that will bind to them is always a lot higher, and you will end up using a lot more antibody. What we did do is really try to make sure that the concentration we used was constant with every comp bead. There are pluses and minuses to both, but comp beads are there all the time, and there isn't as much effort in preparing them as what you need to put in when you are using peripheral blood.

0:48:00 - 0:49:43

Thank you. The next one is, I like how you classify dye into brightness categories, say dim, moderate, bright, very bright. What about doing this for antigen density information, say low, medium, high density?

Yes, that is important as well. You would rank your markers by the antigen density you expect, and where you expect a high antigen density, you could use the fluorochromes that maybe aren't as bright. For example, something like CD8 has a very high antigen density, so you could use some of the dimmer dyes on CD8, whereas for other markers that may have a very low antigen density, you would assign some of the brighter dyes. You would also try and use dyes that are bright and that don't give you too much spillover. For the antigens that you're not sure of, that you're testing for and don't know what their expression is going to be, or ones where there is transient expression, again you would try and assign dyes that don't have too much spillover and are reasonably bright as well.

0:49:43 - 0:50:00

Thank you. The next one is, can you summarize the instrument characterization studies you recommend doing for any flow cytometer? Does this characterization ever need to be repeated?

0:50:00 - 0:51:18

It should be repeated every time your photomultiplier tubes are changed, the instrument is realigned, you get a new laser, or any of the optical components are changed; any of those will make a big difference. For one of the older instruments that was giving us a poor response, we've started to purchase new filters. We haven't repeated the studies yet, but once we do, we'll be able to see a marked difference, and show the difference between what the characteristics were before we replaced a lot of the filters and how the instrument behaves after some of the optical components were changed. Realistically, the characteristics will change as soon as you change anything on the instrument. Also, over time, as I showed in some of the slides, things will degrade, and the characteristics won't be as good as what they may have been a few years prior.

0:51:18 - 0:51:31

Thank you. Looks like we have time for one more question. That will be this one. What do you think is the main reason for the differences in the staining index on different instruments of the same dye?

0:51:31 - 0:52:41

The main difference could be the laser, particularly for some of the yellow-green dyes. If you're exciting them with a yellow-green laser, then you are going to get a better response. Some instruments don't have a yellow-green laser; they're using 488 nm excitation, which isn't as optimal for those dyes. You're not actually exciting the dye at its optimal absorption. The other component, which I think is very important, is the actual collection filters; I have found that they make a huge difference as to how good a signal you get, depending on the quality of the filter and whether it has degraded. Also, to some extent, the actual photomultiplier tube: some just have a little bit better sensitivity at certain spectral wavelengths.

0:52:41 - 0:52:51

I would like to once again thank Grace Chojnowski for her presentation. Grace, do you have any final comments?

0:52:51 - 0:53:09

I can't think of any at the moment, but if anybody would like to ask any questions, please feel free to contact me. My contact details are on our website, the QIMR Berghofer website.

0:53:09 - 0:53:45

Thank you and I’d like to thank the audience for joining us today and for their interesting questions. As Grace said, questions we did not have time for today will be addressed by the speaker via the contact information you provided at the time of registration. We would like to thank our sponsor, Thermo Fisher Scientific, for underwriting today’s educational webcast. This webcast can be viewed on demand through January 2019. LabRoots will alert you via email when it’s available for replay. We encourage you to share that email with your colleagues who may have missed today’s live event. Until next time. Goodbye.

Analyzing your CRISPR gene editing efficiencies using flow cytometry

In this webinar, Natasha Roark, scientist at Thermo Fisher Scientific, discusses CRISPR gene editing technology and its role in flow cytometry editing efficiencies.

00:00:00 - 00:00:21

Welcome to our webinar, "Analyzing your CRISPR Editing Efficiencies Using Flow Cytometry," presented by Biocompare and sponsored by Thermo Fisher Scientific. My name is Peter Fung. I'm the managing editor of Biocompare, and I'll be the moderator for today's presentation.

00:00:21 - 00:00:38

Before we begin, I'd like to inform our viewers that this event will hold a live question and answer session at the end of the presentation, and you can submit a question at any time using the Q&A box located on your screen. Also note the resource list containing documents and links pertaining to this webinar.

00:00:39 - 00:01:19

Now allow me to introduce today's presenter. Natasha Roark is a scientist in the Synthetic Biology Department at Thermo Fisher Scientific. She received her Bachelor of Science degree from the University of California, Santa Cruz, in health sciences and molecular, cellular and developmental biology. Her current work is focused on molecular and cellular biology, high-throughput cloning, genome editing, cell engineering, and cell analysis. Today, she'll share some examples of using flow cytometry in genome editing applications and cell engineering workflows. Welcome, Natasha. We're looking forward to your presentation.

00:01:19 - 00:01:38, Slide 1

Thank you so much for having me. Today, I'm going to discuss analyzing your CRISPR editing and genome editing efficiencies using flow cytometry, and discuss how we incorporate flow cytometry and use it as a tool in our general genome editing workflow.

00:01:38 - 00:01:53, Slide 2

First, what is genome editing? Genome editing is an approach whereby we insert, modify, or replace DNA bases in a genomic DNA sequence.

00:01:53 - 00:02:50, Slide 3

Why do we use genome editing? We use genome editing to study gene function by creating a knock-in or knockout, to create a target-specific gene mutation, perhaps a SNP correction, to perform targeted transgene addition, or to create a heritable modification. We also use it to label endogenous genes, to stably integrate an expression cassette into a desired cell line, and for general tissue and cell engineering purposes to produce novel functions. This can be used to create very powerful animal disease models. It can, of course, be used for high-throughput screening and tissue disease modeling, for stem cell engineering, and even for applications such as gene therapy and agricultural science, in which we can create disease-resistant transgenic plants.

00:02:50 - 00:03:16, Slide 4

A little background on genome editing: there are traditional methods of genome editing that you may be familiar with; a couple of these are engineered meganucleases and zinc-finger nucleases. More recently, we have used transcription activator-like effector nucleases, called TALENs, and even more recently the CRISPR/Cas system: clustered regularly interspaced short palindromic repeats.

00:03:16 - 00:03:46

TALENs and zinc-finger nucleases are similar. They're proteins with programmable DNA-binding domains that can be engineered to bind to a specific DNA sequence, allowing targeting of a specific mutation at a specific genomic locus. CRISPR/Cas, however, is an RNA-guided DNA endonuclease system that leverages short non-coding RNA to direct the Cas9 endonuclease to the DNA target site.

00:03:46 - 00:05:07, Slide 5

CRISPR/Cas9 is truly becoming a transformative technology in the genome editing field, and as you can see by the number of publications and applications that now use CRISPR/Cas9, it's all over the news. It's had a huge impact on the entire field. A little background on CRISPR/Cas9, if you're not familiar: Cas9 is a bacterial endonuclease. It's directed by a small RNA, a guide RNA, that's actually a chimera of a target-specific CRISPR RNA, which confers specificity using standard Watson-Crick base pairing rules, and a tracrRNA, which associates with the Cas9. Together, these are collectively expressed in mammalian systems as a chimera that we refer to as a guide RNA, and these can be targeted to a genomic locus to introduce a DNA double-stranded break virtually anywhere that contains a PAM sequence, an NGG sequence. CRISPR/Cas9 has been so readily adopted and has been such a transformative technology really because never before have we had a genome editing tool that's as easy to design and as high-efficiency as CRISPR, and it's very easy to implement into any workflow.
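As an illustration of the NGG PAM rule just described, here's a minimal forward-strand scan for candidate Cas9 target sites (a sketch only; real design tools, like the portal mentioned later in the talk, also score specificity and scan both strands):

```python
def find_cas9_sites(seq: str, protospacer_len: int = 20):
    """Return (position, protospacer, PAM) for every site where a
    20-nt protospacer is immediately followed by an NGG PAM
    (forward strand only, for simplicity)."""
    seq = seq.upper()
    hits = []
    for i in range(len(seq) - protospacer_len - 2):
        pam = seq[i + protospacer_len : i + protospacer_len + 3]
        if pam[1:] == "GG":  # N can be any base
            hits.append((i, seq[i : i + protospacer_len], pam))
    return hits

# Toy sequence: one NGG ("TGG") right after a 20-nt stretch.
demo = "ACGT" * 5 + "TGG" + "AAAA"
for pos, proto, pam in find_cas9_sites(demo):
    print(pos, proto, pam)
```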

00:05:07 - 00:06:05, Slide 6

As a little background on the CRISPR/Cas9 system: I already mentioned that CRISPR stands for clustered regularly interspaced short palindromic repeats. Now, where does this come from? This is a prokaryotic adaptive immune response system that confers resistance to exogenous nucleic acids, for example phage DNA, and it's an RNA-guided DNA endonuclease system. How it works is, following a viral invasion, bacteria will integrate a small piece of foreign DNA into the CRISPR loci in their chromosome. These loci are then transcribed into short CRISPR RNAs that are complementary to the previously encountered foreign DNA. This prepares the bacterium for subsequent viral attack; the CRISPR RNAs form complexes with Cas proteins to create effector complexes that guide detection and cleavage of the target DNA.

00:06:05 - 00:06:23, Slide 7

Precision genome editing using engineered nucleases all follows the same basic principle: all of these engineered nucleases introduce a double-stranded break at a specific location in the genome. These double-stranded breaks are then repaired by the endogenous cellular machinery.

00:06:23 - 00:06:35

Here we have an example with a double-stranded piece of DNA. We can add our engineered nuclease, the zinc-finger nuclease, the TALEN, or CRISPR/Cas9, and we get a resulting double-stranded break.

00:06:35 - 00:06:48

Now, once this double-stranded break occurs, repair can go through two different pathways: we have non-homologous end joining and homologous recombination.

00:06:48 - 00:06:48

Now, these two different pathways act together and often compete with one another, and they utilize different proteins for repair. Non-homologous end joining is definitely the most prevalent. It occurs at a much higher frequency, and it's a very random process. As the endogenous cellular machinery repairs this double-stranded break, it's messy, and it generally results in indels, insertions or deletions of DNA, at the target locus. This is generally used to create a knockout: to knock out a gene, to knock out a protein of interest. Alternatively, homologous recombination is much more specific. In this case, we actually introduce a donor DNA with our genome editing tool, and this allows us to create perhaps a very precise knockout, perform a gene correction, a SNP correction for example, a small deletion, an insertion, or a knock-in, and we indeed use it for all of these applications: mutating a promoter, adding a promoter, adding a gene, or even adding an endogenous tag.

00:06:48 - 00:10:01, Slide 8

One of the methods that we use to assay our genomic cleavage is the GeneArt Genomic Cleavage Detection Kit. This is a very quick, relatively straightforward, and very quantifiable way to assay your genome editing efficiency at a given locus of the actual endogenous DNA. It's an essential tool for monitoring efficiency, and it's much quicker and much less expensive than sequencing, for example; this would be a quick check of your actual gene modification. How this works is: we transfect our cells of interest with the desired genome editing tool, which could be the GeneArt CRISPR nuclease vector in this example. Once we transfect these cells and allow the tool to work, we then harvest the cells, pellet them down, and create a cell lysate. We then design target-specific primers flanking the region of interest, and amplify that region containing the modification, including the indels created by non-homologous end joining. After PCR amplification, we denature and re-anneal the PCR amplicon to form heteroduplexes, which creates mismatches in the DNA that can then be cleaved by our mismatch detection enzyme and run out on a gel. You can see from this image that this is a very quick, quantifiable, and accurate way to measure genome modification efficiency. It's easy, takes about five hours start to finish, and it's actually quantitative; we've confirmed this with next-gen sequencing.
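The gel quantification step can be illustrated with the correction formula commonly published for mismatch-cleavage assays, which accounts for heteroduplexes forming from one modified and one unmodified strand (a sketch; the band intensities below are hypothetical, and I'm assuming the standard formula rather than quoting the kit's manual):

```python
from math import sqrt

def indel_percent(parental: float, cleaved: list) -> float:
    """Estimate % modification from gel band intensities using the
    formula commonly used for mismatch-cleavage assays:
    100 * (1 - sqrt(1 - fraction_cleaved))."""
    frac_cleaved = sum(cleaved) / (parental + sum(cleaved))
    return 100 * (1 - sqrt(1 - frac_cleaved))

# Hypothetical band intensities from a densitometry trace:
# one uncleaved parental band and two cleavage products.
print(round(indel_percent(parental=6400, cleaved=[1800, 1800]), 1))  # 20.0
```

The square root reflects that an unmodified/unmodified duplex is the only pairing the enzyme cannot cut, so the uncut fraction is roughly the square of the unmodified allele fraction.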

00:10:01 - 00:10:44, Slide 9

Today, I'm going to take you through this general cell engineering workflow, which we really strive to address from start to finish in my department. The premise is design, deliver, and detect. We start by identifying our target sites, designing and ordering the appropriate tools, delivering these editing tools to our healthy cells with high transfection efficiency, and then detecting: validating and quantifying our genome editing efficiency, or even our knockout efficiency, through the various products we're creating or the various internal assays that we use.

00:10:44 - 00:12:41

I have a few examples here of the variety of products that we use in our day-to-day workflow, and that we also offer, as options for cell engineering. Beginning with design, we would start with the GeneArt CRISPR Search and Design Tool portal, where we enter our query, our genomic DNA of interest, and find optimal CRISPRs based on our proprietary algorithm using the most up-to-date rules and principles we know about guide RNA design. Once we choose our guide RNAs, we can choose delivery depending on our application. We may want a vector, in which case we could use the GeneArt CRISPR Nuclease Vector. We may want IVT, using the GeneArt Precision gRNA Synthesis Kit, or maybe we want stable integration, in which case we've used our Invitrogen LentiArray CRISPR Libraries. Then for delivery we have a variety of options as well: Neon transfection, lipid-based methods, again Lenti as the system for delivery, and GeneArt Platinum Cas9 Nuclease. But what I really want to focus on most today is the detection, the confirmation of this genome editing, and the different ways that we use flow cytometry and imaging techniques to help us assay and quantify our genome editing and improve our genome editing conditions. Here I'm going to talk about how we use the Attune NxT Flow Cytometer, our GeneArt Genomic Cleavage Selection Kit, and the EVOS Imaging System, and of course, in the end, all of these methods are verified using Ion PGM or Sanger sequencing.

00:12:41 - 00:13:02, Slide 10

First, I'm going to talk about our genomic cleavage selection reporter and how we can use it with TAL effectors or CRISPR guide RNAs. This is a nice validation tool that we have for genome editing that really has a lot of versatility and is quite simple to use.

00:13:02 - 00:13:44

On the left here, we have the Genomic Cleavage Selection Vector. This vector consists of a promoter, a V5 tag, and an OFP coding sequence, for orange fluorescent protein, that's disrupted by the target sequence binding site. So, within the OFP sequence we'll have either the TALEN binding site or the CRISPR guide RNA binding site cloned into the vector. The vector comes linearized, and we just order oligos with the correct overhangs and clone them in; now we have an OFP disrupted by our target sequence, as well as a T2A self-cleaving peptide with the CD4 coding sequence.

00:13:44 - 00:14:18

Now that we have this vector, we can co-transfect this reporter vector with our nucleases, CRISPR or TALEN. Upon successful cleavage by the nuclease, and repair of the double-stranded break by the endogenous repair mechanisms, similarly to what happens on the endogenous DNA, we will have OFP+ cells, fluorescing orange, as well as CD4 being expressed on the membrane.

00:14:18 - 00:14:57

Now, the nice thing about this system is that it's very simple to design and construct, and it's very quick. Within 20 hours of your transfection, you can generally see OFP expression under the fluorescence microscope, and that will really tell you three things right away: that your design is correct, that your genome editing tool and your vector are entering the cells, and some idea of the efficiency with which the tool is entering and cleaving, which is key.

00:14:57 - 00:17:41

There are sometimes issues with expression of CRISPRs. There are promoter constraints. There are a variety of things that can affect the efficiency. This actually provides a way to assay our genome editing tool: you can check several different CRISPRs with it and have a very rapid method for assaying their cleavage efficiency using flow cytometry. You can see in this panel here that this is exactly what we did. We are showing three different cell lines, 293SP, U2OS, and A549 cells, as well as three different targets: the AAVS1 and HPRT1 safe harbor loci, and the IP3R2 locus. In the top panel, we co-transfected this reporter with the AAVS1 guide RNA or HPRT1 guide RNA, and also co-transfected with an irrelevant guide RNA to account for background subtraction and any resulting auto-fluorescence, and then measured the OFP expression using flow cytometry. We got a nice quantifiable result for how well these cleavage tools are working. In the bottom panel, the same experiment was done using TALENs. You can see these are slightly lower efficiency, and while they still work in these cell lines, they are not quite as robust in their OFP expression. This actually correlates quite nicely with genomic cleavage detection and sequencing methods. Of course, this is exogenous DNA; it is, as the name says, a surrogate reporter. You're not going to capture considerations such as the accessibility of the locus, for example, and it's not going to be able to tell you if you have a bi-allelic knockout, but it's a very quick, easy, and quantifiable way to look at your genome editing tool in the cell.
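The background subtraction with the irrelevant guide RNA boils down to subtracting the control reading from the sample reading (a minimal sketch; the percentages are hypothetical, not the values from the slide):

```python
def cleavage_signal(ofp_pct_sample: float, ofp_pct_irrelevant: float) -> float:
    """OFP+ % attributable to on-target cleavage: the reading with the
    targeting guide RNA minus the reading with an irrelevant guide
    (auto-fluorescence / background control), floored at zero."""
    return max(0.0, ofp_pct_sample - ofp_pct_irrelevant)

# Hypothetical flow readings: 23.5% OFP+ with a targeting guide,
# 1.2% OFP+ with an irrelevant guide in the same cell line.
print(round(cleavage_signal(23.5, 1.2), 1))  # 22.3
```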

00:17:41 - 00:17:52, Slide 11

It also offers the added benefit of being able to enrich. Oftentimes we're working with a very difficult-to-transfect cell line, or maybe difficult loci. CRISPRs are PAM-limited; you are limited by that protospacer adjacent motif, and sometimes there are not as many options as you would like, so you may have to accept slightly less efficiency from the CRISPR that gives you the targeted mutation you want. You really want to be able to do everything you can to enrich for these edited cells, because one of the biggest bottlenecks in the workflow is diluting these clones down, isolating single clones, carrying them forward, and then performing sequencing on them, for example.

00:17:52 - 00:19:12

This provides a method to enrich for these cells. We can either use Dynabeads CD4-based enrichment, in which we use CD4-coated Dynabeads and pull these cells out of solution using magnets, or we can use fluorescent cell sorting to sort for OFP+ cells. That's exactly what we did here. You can see the gel below; this is the genomic cleavage detection assay of cells pre- and post-enrichment, enriched with OFP and CD4. In Lane 1, you can see that you have virtually no genomic cleavage as quantifiable by the gel. Lane 2 is the flow-through sample from the CD4 beads, but in Lanes 3 and 4, you can see that the efficiency is now 19.3% and 13.9% respectively, which is greater than 10-fold enrichment. Now, 19% and 13% are still relatively low, but this turns a nearly impossible experiment, as far as the amount of work and plates that it would take to carry out these clones, into an experiment that's very workable and very doable.

00:19:12 - 00:19:44, Slide 12

We've also quantified all these different methods. Here's an example of analyzing genome editing efficiencies detected by the Genomic Cleavage Selection Kit, the Genomic Cleavage Detection Kit, and NGS sequencing. In this workflow, we designed perfect-match TALENs, transfected the cells with the Neon Transfection System, and then used the Genomic Cleavage Selection Kit, as well as amplified the target loci from the cell lysate.

00:19:44 - 00:20:38

Then we took these through Ion PGM sequencing and the Genomic Cleavage Detection Kit. In this case, you can see that the Genomic Cleavage Detection and the NGS data really correlate quite nicely with one another. The GCS vector also correlates; it is slightly lower, and we do see this in different cell lines. Again, this is a reporter. It's going to depend on the cell health and how well the cell expresses OFP, and again, it's not able to detect a bi-allelic knockout. In some cases it will overestimate your genomic cleavage; in other cases it will underestimate. It's really just meant as a means of assaying your tools.

00:20:38 - 00:22:26, Slide 13

Another method where we use flow cytometry to help our genome editing workflow is with our GeneArt CRISPR Nuclease Vectors. We use these for genome editing and transfection efficiency monitoring simultaneously. We have GeneArt CRISPR Nuclease Vector kits; the cloning is very similar to the GCS vector that I previously described, but in this case you're actually going to clone in your guide RNA sequence. There's a U6 promoter, there's also a Cas9, and then you have the choice of either CD4 or OFP being expressed from the vector. Now, this provides another opportunity for analysis and enrichment. You can see in the top panel that you can very quickly get an idea of your transfection efficiency at the same time that you are delivering your genome editing tool; you can analyze that via flow cytometry and use it to optimize conditions. Similar to the GCS vector, you can also sort these cells. In this example here, I have a pre-sort population in which 9.5% of cells were OFP+. I sorted on OFP+ cells and ended up with a post-sort OFP+ population of 99%. I saved these samples and ran the Genomic Cleavage Detection Kit on them, and you can see that pre- and post-sort, I got about 34-fold enrichment, from 1.4% to 48.1% cleavage. This is again a very valuable tool, particularly when we're looking at very difficult-to-transfect cells, or again, cells where we are very limited in our options for the genome editing tool.

00:22:26 - 00:22:57, Slide 14

Again, this is also available with CD4; here is just an example. The CD4 vector is also quite versatile. We can stain with a CD4 Alexa Fluor 488 antibody conjugate, which is what I did here, analyze this on the EVOS, run flow cytometry, and then also run the Genomic Cleavage Detection Kit. This gives us ways to correlate transfection efficiency with cleavage efficiency, evaluate different tools, and optimize assays.

00:22:57 - 00:26:09, Slide 15

I'm going to switch gears a little bit and talk less about delivery and more about some of the general ways we use flow cytometry in the lab. Once we get our edited clone, we really need to validate it, just like any cell line. This is the cell line we're going to use, keep, and provide to customers, and we're going to do a tremendous amount of downstream work on these cell lines, so having high sensitivity and resolution for fluorescent proteins and dyes is very, very important. Because we're genome editing, we're looking mainly at cell expression or gene modulation, so we work with fluorescent proteins. Fluorescent proteins can be a little difficult to work with in that they have very large emission spectra. They're excited by multiple lasers, there's a lot of bleed-through, and they can be a little more difficult to resolve than dyes. Furthermore, we want to be able to resolve them alongside dyes. Here's an example of some of the fluorescent proteins that we use: mKate, TagBFP, OFP, Emerald GFP, and RFP. We regularly use these in conjunction with viability dyes to validate the cell lines we're making, and it's really increased our options for creating these different cell lines because we have four lasers to work with and 14 possible colors. You can see in the example above, this is a cell line I was making: a U2OS line with an NF-kB GFP fusion protein that's also stably expressing Cas9. In doing this, I wanted to screen several different clones and pick the best one. You can see that not only can I very clearly see the separation between this dimly expressed GFP and the negative sample, but I can also distinguish SYTOX-positive cells from GFP-positive cells. It's really that sensitive. Additionally, in the bottom panel you can see, from left to right, the U2OS Cas9 cells and the U2OS NF-kB GFP Cas9 cells, each treated with SYTOX Orange. There's really no overlap between the two, and you can very clearly distinguish SYTOX Orange from GFP+ cells, as well as SYTOX Green from GFP+ cells. This really offers us a lot of different options, because we make a lot of reporters using a variety of different fluorescent proteins, and it's really nice not to be limited by our analysis tool for this detection.

00:26:09 - 00:27:25, Slide 16

As I mentioned before, we use it for clone selection and validation as well. As we're creating these stable cell lines, I'm often screening dozens upon dozens of clones and wanting to choose the best ones. The high-throughput capability of the Attune [NxT Flow Cytometer], and really how fast and easy it is to use and how straightforward the workflow is, makes a huge difference: I can run a 96-well plate and screen 96 clones in about 30 minutes with virtually no sample prep. I just briefly trypsinize the cells, quench with FACS buffer, put the plate in the Autosampler, and 30 minutes later I have my results. This is an example of working with the same cell line, picking the clone with my desired mutation. In green, I highlighted what I found to be the best one, H11 on the far right, given that it has the lowest SYTOX Blue-positive population as well as the highest GFP+ population, making it the most homogeneous cell line.

00:27:25 - 00:29:32, Slide 17

Another method that I use flow cytometry for, and that we employ in our group on a regular basis, is production of our Invitrogen LentiArray CRISPR Libraries. These are one of our new product offerings, and you can see the plate configuration on the right: four guide RNAs per gene, one gene per well, and 80 genes per plate. The arrayed libraries are separated by family; for example, we have the kinase library, the protein receptor library, the phosphatase library, and so on. We provide these not only as glycerol stocks ready for DNA preparation, but also as high-titer arrayed virus for screening. Every gene of every library actually goes through an antibiotic-selection titer to confirm that it meets our product specifications. This is an extremely time-consuming and laborious process, so we use GFP to optimize the titering process, as well as to provide some guidance for the customer. These are controls that are also available for sale to the customer, and that we use internally to optimize our transfection and antibiotic-selection conditions. They can also be used as negative or positive controls for gene modification, and we use them as a high-throughput titer check of our Invitrogen LentiArray CRISPR Libraries. You can see the vector up here on top: there's a U6 promoter driving the guide RNA expression, which is cloned in at this site, and then in the control vector we also have GFP and puromycin, the GFP to aid us in titering and the puromycin for selection.

00:29:32 - 00:29:58

We provide two different particles. One is a positive control, which expresses a guide RNA targeting the HPRT site. This is a high-cleavage, high-activity, safe-harbor site that the customer can use as a control for genome editing and genome modification. Then we have a negative control particle, which expresses a scrambled guide RNA that matches nowhere in the human genome.

00:29:58 - 00:31:15, Slide 18

Now, I mentioned we do an antibiotic selection of every well in this library, but that's very time-consuming and laborious, and we want a quick internal check before we go through the entire process of titering every single well. So our general workflow is this: we seed our cells, transduce them with the newly made lentivirus, change the media on day three, and then as early as day five, we can perform a high-throughput GFP titration using the Attune [NxT Flow Cytometer]. Again, it takes half an hour per plate, and it's a very quick, very quantifiable titer check before moving 10 days further down the workflow through antibiotic selection. On the bottom left is an example of the plate map provided by the Attune [NxT Flow Cytometer]; it gives us a very quick picture of our titer across the different dilutions throughout the plate. Then on the right, you can see our calculation for the titer of this plate: the percent of GFP+ cells times the cell number and dilution factor, divided by the volume of viral inoculum.
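The titer formula just described can be sketched in a few lines of code. This is a hypothetical helper for illustration only (not part of any Attune software), and the input values are made up; a 20% GFP+ well with 50,000 cells, a 1:100 dilution, and 0.1 mL of inoculum works out to about 1e7 transducing units per mL.

```python
# Sketch of the GFP-based lentiviral titer calculation described above:
# titer (TU/mL) = (%GFP+ x cell number x dilution factor) / inoculum volume.

def gfp_titer(pct_gfp_positive, cell_number, dilution_factor, inoculum_ml):
    """Return the viral titer in transducing units per mL."""
    return (pct_gfp_positive / 100.0) * cell_number * dilution_factor / inoculum_ml

# Illustrative numbers: 20% GFP+ cells, 50,000 cells, 1:100 dilution, 0.1 mL
print(gfp_titer(20, 50_000, 100, 0.1))
```

For the percentage to reflect independent transduction events, wells in the linear range (roughly under 30% GFP+) are typically the ones used for the calculation.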

00:31:15 - 00:31:46

Again, this is very quick; it's just taking the cells, trypsinizing them, quenching, and running. We have three Attunes, and on a titering day these run all day long. One great benefit is that we see very little clogging and very few problems, even though we're going directly from the plate with very minimal cell preparation.

00:31:46 - 00:33:10, Slide 19

Now, I'm going to talk about how we use flow cytometry to actually measure our GFP knockdown efficiency. I've talked about genome editing, target reporters, titer checks, and transfection optimization, but we also use flow cytometry to actually measure the knockdown, which is really the goal of these libraries we're creating. So we've designed several different assays to help us do that, and here's one. We have a HEK293 GripTite GFP stable cell line, which stably expresses GFP. Then, using our design algorithms, we design, clone, and produce lenti CRISPR guide RNA viral particles targeting this GFP. We then co-infected these with Cas9, followed by antibiotic selection for three, seven, or 14 days, and at each time point analyzed the GFP knockdown efficiency and genome modification efficiency. Up here is an example of the triplicate sample plate map for one of the selection times, the fluorescence microscopy of the knockdown, and an example of the genomic cleavage detection at that point.

00:33:10 - 00:35:09, Slide 20

So, here in the middle, we have the overlay plots for each of these different guides. Let's start with the cytometry data on top: we have three, seven, and 14 days moving down. In the left panel, we have targets one, two, three, and four, and in the right panel we have both the GFP pool and the scrambled pool. The GFP pool is a pool of targets one, three, and four, to simulate the design of our libraries, and the scrambled pool is a pool of three different scrambled guide RNAs that also match nowhere in the genome. We infected these cells and assayed them with single guides as well as pooled guides. You can see in the overlay plots that GFP decreases over time, but at very different rates. On the left, I'm looking at an ANOVA of percent GFP, and investigating the selection conditions, you can see that across three, seven, and 14 days this changes quite a bit for some targets but less for others as far as GFP knockdown efficiency. Of course, everything increases by 14 days, but we want to make things as quick as possible, and based on the data from this experiment, seven days seems just fine to get maximal selection. On the right, you can see an ANOVA analysis of the genomic cleavage efficiency in this experiment, and while genomic cleavage correlates, it's not the same as looking at the actual knockdown. There are a few mysteries here, such as why target three, for example, gives very decent genomic cleavage but much less knockdown than the others.

00:35:09 - 00:36:34, Slide 21

So, breaking this up by target and looking at day of selection for each target, you can see that for target one, the GFP knockdown increases dramatically at seven days post-selection and then modestly to 14. Target three goes through a step-wise GFP decrease, and target four really drops at seven days, almost to nothing, and virtually stays there. Similarly, the pooled targets drop dramatically at seven days and stay there. The other information that can be gleaned from this is that our pool design is really quite good: the pooled guide RNAs targeting a specific gene should be at least as good as the best genome editing tool in the pool, and that's one of the reasons we like to design four per gene; it really maximizes your efficiency of getting that knockout. As we're learning more and more about CRISPR, the non-homologous end joining that results actually does happen in a pretty predictable manner; we just haven't completely elucidated what that is to correctly make the prediction. So this definitely increases our ability to get a gene knockout over time.
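The claim that a pool should perform at least as well as its best member can be illustrated with a simple independence model. This is my simplification for intuition, not an analysis from the talk, and the per-guide efficiencies below are made-up numbers: if each guide cuts independently with probability p, the chance that at least one guide cuts is 1 - (1 - p1)(1 - p2)... , which can never fall below the best single-guide rate.

```python
# Naive independence model (illustrative assumption, not data from the talk):
# the probability that at least one guide in a pool cuts a given target.

def pooled_cut_probability(per_guide_efficiencies):
    miss = 1.0  # probability that every guide misses
    for p in per_guide_efficiencies:
        miss *= (1.0 - p)
    return 1.0 - miss

# Four hypothetical guides of varying activity; the pool beats the best (0.6)
print(pooled_cut_probability([0.6, 0.3, 0.2, 0.1]))
```

Under this model the pool strictly improves on the best guide whenever any other guide has nonzero activity, which matches the rationale for designing four guides per gene.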

00:36:34 - 00:37:17

Looking at the pooled analysis, you can see that the GFP knockdown for each target is quite different, but it varies more by target than by day of selection. You can see in the right-hand histogram that, pulling all the data together (we really want to use this as an example to provide guidance to customers, and to ourselves, in optimizing these assay conditions), seven days appears more than sufficient to get the desired gene knockdown, based on this model.

00:37:17 - 00:38:45, Slide 22

Okay, so in addition to using flow cytometry to verify our knockout efficiency, we can also use flow cytometry and these fluorescent proteins to verify homologous recombination. In this particular experimental setup, we take advantage of the sequence homology between BFP (blue fluorescent protein) and GFP (green fluorescent protein): in this specific region, they differ by just one nucleotide. As you can see from the panel on the right, we have a cell line expressing BFP, and when it is transfected with the IVT guide RNA and Cas9 mRNA, we see restoration of GFP. Now we have a quantifiable and verifiable reporter assay for HR efficiency, and the other nice thing is that it's stably integrated, so we can take this stably expressing BFP cell line and use it to optimize all sorts of assay conditions. In the video playing on the right, you can very easily see the GFP appear over time as homologous recombination restores GFP expression.

00:38:45 - 00:41:23, Slide 23

Now, using that assay, we can further investigate homologous recombination. Here are two of the different assays we have developed for this: one is the conversion of BFP to GFP using that one-nucleotide difference; another is creation of a GFP mutant and then restoring GFP expression with a donor. You can see the designs up here on top, the six-nucleotide change on the left and the one-nucleotide change on the right, and the fluorescent images when we transfect with just the Cas9 RNP (that's the Cas9 protein with the guide RNA) and then the Cas9 RNP with donor. You can see that we get very efficient homologous recombination, and we can use flow cytometry to assay this, which we did down below. After developing this, we further investigated and tried many different conditions to very quickly, and in a high-throughput manner, analyze and quantify the methods that most effectively support homologous recombination. This is all GFP data by flow cytometry. We co-transfected with sense and antisense oligos to see which of them is more effective. We also tried sequential delivery, meaning the RNP followed by the donor DNA, and the donor DNA followed by the RNP, and then did a dose response. We did a similar experiment on the right; this is the one-nucleotide change that I previously described. In this case, we see BFP convert to GFP, and we used this to also optimize sequential delivery and dose response - sorry, oligonucleotide and RNP dosage - for electroporation. Electroporation is another thing we had to optimize, and it varies greatly; it's going to produce very different efficiencies based on the cell line and the type of targeted mutation you're trying to make. We use all of these assays to optimize our conditions before moving on to an endogenous gene of interest.

00:41:23 - 00:42:09, Slide 24

These are some of the flow cytometry applications we use in our genome editing and cell engineering workflow. I talked about how we use fluorescent surrogate reporters and genomic cleavage detection, how we use these reporter-type systems to deliver and monitor the transfection efficiency of genome editing tools, how we can use them to enrich for genetically modified cells, and how we can use them for high-throughput viral titering and workflow optimization. I talked a little bit about knockout screens and cell line validation, and really the next thing we're focusing on is using flow cytometry to perform various phenotypic screens.

00:42:09 - 00:42:54, Slide 25

Our goal, again, is really to create complete and optimized solutions for these genome editing workflows from start to finish, from design to validation, and flow cytometry is an integral part of this entire workflow: not only is it used for analysis, assay optimization, and QC checks for our products, but it's also a means to enrich for our cells and validate our chosen cells, via proliferation assays or LIVE/DEAD cell stains, and so on.

00:42:54 - 00:43:11, Slide 26

So, I’d like to acknowledge my team. This is our site, Thermo Fisher Scientific in Carlsbad, and this data was all compiled by the Synthetic Biology R&D team.

00:43:11 - 00:43:24, Slide 27

Thanks so much for listening and I’ll take your questions.

00:43:24 - 00:43:28, Slide 28

Thank you so much, Natasha, for a great presentation. We’ll now begin our question and answer portion of the webinar.

00:43:28 - 00:43:46, Question 1

Our first question is: Does Thermo offer a BFP or mutated GFP cell line?

We do not currently offer these as commercial products. No.

00:43:46 - 00:44:48, Question 2

Okay. We have another question here: Why were CD4 and OFP chosen over others?

We actually made, during product development, a variety of different types of reporters with different fluorescent proteins and different membrane protein expression. We chose OFP and CD4 because they worked quite well, as well as for additional internal business reasons. OFP was a nice choice because it's very bright and compatible with most detection methods: you can see it under an EVOS, and you can see it no problem with the 488 laser. Additionally, for CD4, we already had such a nice workflow set up with our Dynabeads kit.

00:44:48 - 00:46:04, Question 3

Great, okay. Let's see. We have another question here: Have you tested this protocol in stem cells, and how easy is transfecting them with CRISPR? If it's hard, how can we make it easier? Can you elaborate a little?

Yes, I absolutely can. That was one of our biggest challenges early on, getting high efficiency in stem cells. We have now been able to get very high efficiency in stem cells using both electroporation and standard lipid methods. It is still quite difficult, and you have the added element after the workflow that you need to prove they are still stem cells after the fact. They are more difficult to clonally isolate, which is a big impediment to the workflow, and they're not particularly easy to sort, but we have actually used all of our tools on stem cells. We get very, very good results using our Cas9 protein in stem cells with our IVT guide RNA. We also have very nice results with Cas9 mRNA using either electroporation or lipid-based methods.

00:46:04 - 00:46:33, Question 4

Okay, great. We have another question here: Does this method work for detecting CRISPR-mediated, mutated transgenic plants instead of plant cells?

I cannot speak to transgenic plants specifically. Which method are you referring to? The genomic cleavage detection kit?

00:46:33 - 00:47:06, Question 5

I’m assuming so, yes. They didn’t elaborate.

Yes, let me see - I believe we have some plant data. If you're interested in knowing more about that, I'll say I'm not the plant expert of the group, but I believe we do have some plant data, so feel free to send me an email and I can see what I can find for you more specifically. But yes, we have worked on plants, and we've actually done a lot with TALENs in plants with some of our collaborators.

00:47:06 - 00:48:21, Question 6

Great information. Okay. Another question is: Is there a difference between hESC and iPSC transfection procedures?

Yes, there is. It's very interesting, because we see it vary greatly from target to target. Originally, we saw that iPSCs were quite a bit easier to edit than hESCs, but as we further investigated and compiled more and more data, we definitely see that it really is target-specific. There are a variety of other reasons that make iPSCs a little easier to work with than hESCs, but we do have a resident stem cell expert, so if you have any further questions or you'd like me to elaborate, again, feel free to send me an email and I can put you in contact and see what data we can share regarding hESC versus iPSC genome editing.

00:48:21 - 00:48:42, Question 7

Okay, great. We have another question here: Transfected cells are quite heterogeneous. Is there any effort you've put into producing more homogeneous engineered cells?

Yes. Well, do you mean as far as the nature of the indel produced?

00:48:42 - 00:49:51, Question 8

I’m assuming that’s what they’re asking. Yes.

Okay. Yes, we do. The number one thing is really improving the transfection efficiency, getting that as high as possible so we have the best population to work with prior to cloning out and doing the deep sequencing and analysis. We are seeing that, again, it's quite messy, but certain CRISPRs - or certain guide RNAs, I should say - tend to give more characteristic mutations. So, as far as getting a more homogeneous cell population: first, optimize for transfection efficiency and get it as high as possible, and then often sort, because you'll see some populations within the transfected cells; sort for the most homogeneous population possible. Then, when you get to sequence confirmation, that removes a little of the noise and makes characterization and homogeneity a little better.

00:49:51 - 00:50:39, Question 9

Great, and we have another question here: What software do you use for your flow analysis?

Honestly, we use the Attune [NxT] software more often than not. Of course, sometimes we'll use FlowJo - we have FlowJo and use it - but for the majority of our analysis, we have everything set up and all of our protocols saved, and we actually find the most efficient, easiest way is to use the Attune [NxT] software, because it's very quick: all of our plate maps are set up, and we very quickly have the data we want to present and keep in our notebooks, as well as all of our statistical data ready to go and export from there.

00:50:39 - 00:52:07, Question 10

Okay, great. We have a lot of good questions coming in. Another is: How do I target multiple sites within a genome at the same time using the CRISPR system?

We actually have some really, really nice data for that, which I believe is all shareable, based on a paper that we published last year from our group. Yes, we see excellent efficiencies targeting multiple different areas in the genome by just co-transfecting with multiple guides and Cas9, then assaying those different loci. In one example, we took four guides and co-transfected them together, did a quick genomic cleavage detection assay on single versus combined guides by amplifying each target locus individually, and then went ahead and looked for the clones that contained all four of those mutations, and we've seen really nice multiplexing results. So, the short answer is yes, it's very doable. Your efficiency, of course, is going to go down just statistically: if one guide gives 70%, maybe all four together will be much smaller, maybe 30-40%, but it can be quite good.
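The statistical drop-off mentioned here can be sketched with a naive independence model. This is an illustrative assumption on my part, not the speaker's data: if each locus edits independently at 70%, the fraction of cells carrying all four edits is 0.7^4, roughly 24%, in the same ballpark as the 30-40% quoted.

```python
# Rough independence model (illustrative assumption, not the speaker's data):
# the fraction of cells edited at all loci is the product of the per-locus rates.

def multiplex_efficiency(per_locus_efficiencies):
    joint = 1.0
    for p in per_locus_efficiencies:
        joint *= p
    return joint

# Four loci at 70% each: roughly a quarter of cells carry all four edits
print(round(multiplex_efficiency([0.7, 0.7, 0.7, 0.7]), 3))  # prints 0.24
```

Real joint efficiencies can deviate from this product, since delivery success is correlated across guides within a single cell (a cell that takes up one guide tends to take up the others), which may explain rates above the naive estimate.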

00:52:07 - 00:53:43, Question 11

Okay, great, and another question here is: Do you measure other metrics contributing to expression, such as cell health, signaling, or function?

Yes, we sure do. This is a part of our research that we are really growing and expanding right now. As I mentioned, in analyzing these clones we use a lot of assays, and a lot of it is pretty straightforward. We look at proliferation rates - that's one I do a lot in validating these edited stable cell lines - using our CFSE Violet. I use a lot of the SYTOX LIVE/DEAD stains, as well as some of our Annexin and Caspase kits. Those are all related to viability, but then, depending on the target we're knocking out - if we're knocking out a receptor, for example - we have other specific methods for looking at that. Now that we are making these libraries, we are moving much more into phenotypic assays, and I hope to provide some nice application data for some of the things we're working on there now. Those, of course, are a lot more specific, because it depends on the specific target and choosing the correct antibody assay for flow cytometry. We also use Westerns, deep sequencing, qPCR, and a variety of different methods for that.

00:53:43 - 00:55:19, Question 12

Great. We have another question here: Are there any design strategies you can use to reduce off-target effects?

Yes, absolutely. If you want, take a look at our CRISPR search and design tool; it's very nice. It was created internally based on all the papers that were available and what we know collectively about off-target analysis, and there are a lot of different rules that we see, that we implement, and that have been incorporated into this algorithm. Rather than going over each and every rule - I do have some more information, and you're more than welcome to message me if you'd like more detail - this tool already has that built in. You can input your sequence and it gives you a score based on the off-target analysis. So, if you want to design four CRISPRs, for example, it will give you a list and rank those different CRISPRs on a percentage based on a variety of different parameters, and one of the highest contributing factors is off-target potential. We also know some things about what makes a CRISPR a little more active, and we've incorporated those as well. My recommendation would be to definitely use our tool, or another tool - there are a variety out there - that can help improve that.

00:55:19 - 00:57:28, Question 13

Great, and we have a slightly more general question here. Researchers are having problems introducing specific point mutations into their cell lines. Is there a method with good efficiency that you could recommend, or could you talk about your experiences?

I can talk a little more in detail about introducing these SNPs; that's part of what we use HR for, as in the last slides of the presentation looking at the BFP-to-GFP conversion, or the case of restoring the disrupted GFP. Our recommendation is to use homologous recombination rather than relying on non-homologous end joining, co-transfecting with donor DNA that's going to result in the point mutation. Now, homologous recombination is traditionally more difficult, and we see much lower efficiencies than with NHEJ, but that's one of the things we're working on, delivering the most optimal conditions and looking at every single aspect of that workflow to make it easier and increase the efficiency, and as you can see, we've been able to increase it quite a bit. Of course, the analysis and validation downstream do become more difficult when you're looking at an endogenous gene, because you're going to have to sequence those clones at the end, but I always recommend optimizing all the conditions with your cell line, and with your target if possible, before actually moving forward.

00:57:28 - 00:57:56, Question 14

Great. We have another question here: Could you detect the level of a knockout? For instance, this researcher is working with polyphenol oxidase transcripts, and they mentioned that they can compare the degree of knockout to determine which has the highest expression. Can you speak to that a little bit?

I'm not entirely sure I understand the question. Are they talking about verifying the knockout using the expression of the genes, using a downstream method?

00:57:56 - 00:58:51, Question 15

That's what I'm assuming as well: just monitoring how effective their knockout was. That's how I would interpret their question.

Okay. So, anytime you have something like that which lets you relatively easily assay the knockdown, I think that's excellent. It's not always that easy, but if you're trying to get the best knockdown, what I would suggest for questions pertaining to experimental design is to try several different guide RNAs, and then perhaps even a pool of guide RNAs, and then downstream, measure the expression or the knockdown in each case, and find the best guide RNA or the best pooled conditions to maximize that knockdown efficiency.

00:58:51 - 00:59:43, Question 16

Great. Let's see, we have another question here: Which repair pathway would you recommend to repair CRISPR-mediated double-strand breaks?

Well, they'll both repair these breaks. Non-homologous end joining is much more efficient, so if you're looking for a knockout, or a biallelic knockout, and you're most concerned with getting that knockout at very, very high efficiency, then I would definitely recommend NHEJ. Homologous recombination is generally for when you want something a little more specific: if you're trying to very specifically add or delete a gene, or correct a SNP, for example.

00:59:43 - 01:01:16, Question 17

Great, and we have time for just one more really quick question: I see that you offer a variety of platforms for CRISPR-Cas9 genome editing; what is your recommended platform for the highest editing efficiency?

Okay, so this definitely depends on the application, but the short answer is we see the highest efficiencies and lowest toxicities across the board using our Platinum Cas9 nuclease, that's the RNP platform, combined with the in vitro transcribed guide RNA. That would be recommended for transient transfection genome editing, but we also have Lenti. Our Lenti libraries, which are primarily used for screening, stably integrate the guide RNA, so you get very high efficiency: you're able to select for the guide RNA-containing cells and select for that knockout over time, with the guide RNA continuously expressing, which I showed in my third-to-last slide. So that produces very, very high knockout efficiency as well. If you'd like to use Lenti, that's generally used a little more for screening rather than transient transfection. There are other considerations where plasmid and mRNA are also very nice tools, but definitely the highest was the RNP-IVT combo.

01:01:16 - 01:01:36

Great. Unfortunately, that's all the time we have today. Thank you, everyone, for sending in your questions. We got a lot of questions about cost and availability of products from Thermo; our presenter will follow up with an email response to you directly to answer those.

01:01:36 - 01:02:34

First of all, we’d like to thank Natasha Roark for sharing her knowledge with us today and giving a great presentation, and also offer a special thank you to Thermo Fisher Scientific for sponsoring today’s event. Please keep a lookout for an email containing a link to the on-demand version of this webinar. Thank you everyone for attending and have a great rest of your day.

Advancing immuno-oncology discovery and development with flow cytometry

In this webinar, Chris Langsdorf, scientist at Thermo Fisher Scientific, discusses how flow cytometry can effectively be employed in research efforts aimed at advancing the discovery and development of various immunotherapy strategies.

Cancer stem cells and mechanisms of multidrug resistance by flow cytometry

In this webinar, Dr. Jordi Petriz (The Josep Carreras Leukemia Research Institute in Barcelona) discusses how flow cytometry can be a key player in investigating cancer stem cells. The ability of normal and tumor stem cells to express transporter molecules that function to exclude anticancer drugs and certain dye molecules can potentially be exploited to identify and isolate cancer stem cells by flow cytometry.

00:00 - 02:52, Slide 1

Hello and welcome. I'm Gwen Taylor, Senior Developmental Editor with Current Protocols at John Wiley & Sons. I'm delighted to introduce today’s webinar, titled: “Cancer Stem Cells and Mechanisms of Multi-Drug Resistance by Flow Cytometry”. This webinar is being co-sponsored by Current Protocols and Thermo Fisher Scientific.

Thermo Fisher Scientific is the world leader in serving science. Their mission is to enable their customers to make the world healthier, cleaner and safer. They help their customers accelerate life sciences research, solve complex analytical challenges, improve diagnostics, and increase laboratory productivity. Through their premier brands, Thermo Scientific, Applied Biosystems, Invitrogen, Fisher Scientific, and Unity Lab Services, they offer an unmatched combination of innovative technologies, purchasing convenience and comprehensive support.

Current Protocols is in its 29th year and is the largest collection of peer-reviewed, authoritative and regularly updated step-by-step research protocols available for life scientists worldwide. Now with 18 titles and over 17,000 protocols, Current Protocols is part of Wiley Publishers. During today’s program, we encourage you to submit your questions throughout the event by clicking on the Ask A Question box at the bottom of your screen. Your questions will not be seen by any of the other attendees. The webinar will be recorded and available for viewing in the next few days, and we will send you an email with details on how to access the on-demand webinar along with a customizable certificate of attendance.

Now, it is my pleasure to introduce today’s speaker. Dr. Jordi Petriz received his BSc degree in Biochemistry and Animal Biology from the University of Barcelona and then pursued his PhD there in Physiology and Immunology, specializing in functional mechanisms of multidrug resistance against anti-cancer agents. Since 2000, he has been a member of the Subcommittee on Quality Assessment of Hematopoietic Stem Cell Grafts and is a principal investigator at the Josep Carreras Leukemia Research Institute. Dr. Petriz has authored nearly 100 research publications, and his current work focuses on linking stemness with ABC transporters. He is studying several genes involved in different aspects of stem cell activation, including some that encode multidrug resistance transporters and others that regulate self-renewal and differentiation. He has also been the president of the Iberian Society for Cytometry since 2007. So, let’s go ahead and get started with a very warm welcome to you, Dr. Petriz.

02:52 – 03:35, Slide 2

Thank you very much for this nice introduction. First of all, I would like to thank Wiley and Thermo Fisher for inviting me to participate in this webinar on flow cytometry. My experience in the flow cytometry field goes back to 1991, and my talk will focus on the study of stem cells, applied mainly to the hematopoietic system. My PhD was in the field of functional flow cytometry and multidrug resistance, and most of this knowledge has been applied to stem cell studies. More recently, we have explored the effect of cellular hypoxia on gene expression.

03:35 – 04:13, Slide 3

During this talk, we will see the different techniques used to identify and isolate rare stem cells and their subpopulations. In fact, CD34+ stem cells from the hematopoietic system have been the basic study component so far.

In this slide, you can see how hematopoietic stem cells, which are CD34+, are able to self-renew and to differentiate into progenitors of all hematopoietic lineages. These processes are regulated by a large number of cytokines to give rise to red cells, white cells, and platelets.

04:13 – 04:47, Slide 4

More sensitive measurements of CD34+ events should have an impact on research in a broad range of scientific and clinical disciplines, including clarifying the number of CD34+ cells needed for engraftment. In this slide, we can see a no-lyse, no-wash flow cytometry technique developed in my lab, aimed at measuring more precisely the absolute number of CD34+ cells in unlysed whole blood; it can be applied to cord blood, peripheral blood, bone marrow, and leukapheresis products.
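Absolute counts like these are typically derived with a single-platform, bead-based calculation. The sketch below shows only the arithmetic, with hypothetical event counts, bead numbers, and sample volume; the speaker's exact protocol may differ.

```python
# Single-platform absolute counting: a minimal sketch, not the speaker's
# exact protocol. With a known number of counting beads spiked into a
# known sample volume, the absolute CD34+ concentration follows from the
# ratio of CD34+ events to bead events acquired on the cytometer.

def absolute_count(cd34_events, bead_events, beads_spiked, sample_volume_ul):
    """Return CD34+ cells per microliter of sample."""
    if bead_events == 0:
        raise ValueError("no bead events acquired")
    return (cd34_events / bead_events) * (beads_spiked / sample_volume_ul)

# Hypothetical acquisition: 1,200 CD34+ events, 10,000 bead events,
# 50,000 beads spiked into 100 uL of whole blood.
cells_per_ul = absolute_count(1200, 10000, 50000, 100)
print(round(cells_per_ul, 1))  # 60.0 CD34+ cells/uL
```

Because cells and beads pass the laser from the same tube, no separate hematology count is needed, which is one reason no-lyse, no-wash protocols pair well with this calculation.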

04:47 – 05:23, Slide 5

As you may know, flow cytometry is a multiparametric technique, and multiple cell characteristics can be studied at the single-cell level. Some dyes allow us to explore cell functions such as mitochondrial membrane potential. In this slide, we can see how the mitochondrial membrane potential of viable CD34+ cells stained with Rhodamine 123 helps to distinguish the primitive compartment within the hematopoietic stem cells, which shows low mitochondrial membrane potential because these cells are mainly quiescent.

05:23 – 05:56, Slide 6

In addition, CD34+-enriched cells from human bone marrow can be used for additional immunophenotyping experiments. The composition of the CD34+ side population differs between bone marrow, peripheral blood, and cord blood. In this slide, we can see highly enriched CD34+ cells that were stained with CD38, a marker classically used to distinguish the more primitive fractions within the human hematopoietic stem cell compartment.

05:56 – 06:48, Slide 7

In this slide, we can see representative CD34 versus DyeCycle Violet measurements in peripheral blood and marrow cells using flow cytometry. DyeCycle Violet is a DNA-selective, cell membrane-permeant stain that effectively subsets CD34+ cells for the definition of different functional properties related to the characterization, resolution, and purification of primitive hematopoietic stem cells, in combination with specific markers for multicolor flow cytometry measurements. The plot on the left displays normal bone marrow; the plot on the right displays malignant disease of the bone marrow with low CD34 expression, acute myeloid leukemia cells.

06:48 – 07:20, Slide 8

Moreover, we can use a series of monoclonal antibodies against CD34+ subsets. In this slide, we can see CD34+, CD38+, HLA-DR+ cells that were sorted and stained with propidium iodide for cell cycle studies, showing that a large number of cells were in the G0/G1 phase and the rest, near 15%, were proliferating.

07:20 – 07:37, Slide 9

This slide displays the cell cycle analysis of CD34+, CD38+, HLA-DR- sorted cells, showing that almost all cells were quiescent, suggesting a more primitive enrichment of hematopoietic stem cells.

07:37 – 08:00, Slide 10

This slide, in turn, shows that CD34+, CD38-, HLA-DR- sorted cells were even more quiescent, giving support to the hypothesis that the more primitive compartment is enriched in G0 cells.

08:00 – 08:35, Slide 11

However, in order to distinguish the G0 from the G1 phase, different studies are needed. In this slide, we can see CD34+ purified cells from peripheral blood that were incubated with Hoechst 33342 for DNA staining and pyronin Y for RNA staining. As expected, most of the cells are distributed in the G0/G1 fraction, but pyronin Y-dim cells, or RNA-low cells, are enriched in G0 cells.

08:35 – 09:13, Slide 12

Thus, the lack of specific markers for stem cells makes the physical identification and isolation of this compartment difficult. Since hematopoietic stem cells differ in their repopulating and self-renewal potential, we also studied their telomere lengths by flow-FISH, showing that CD34+ cells reveal remarkable telomere length heterogeneity, with a hybridization pattern consistent with different classes of human hematopoietic stem cells, suggesting that cells with long telomeres could be enriched in long-term repopulating cells.

09:13 – 09:31, Slide 13

However, to complicate the issue, there is experimental evidence demonstrating the existence of cells in the CD34- population of human bone marrow and cord blood that can repopulate the bone marrow of immunodeficient mice. 

09:31 – 09:59, Slide 14

Clearly, deciding which marrow cell population to select is a crucial issue. Magnetic-activated cell sorting has been used for large-scale positive selection of CD34+ cells. The negative fraction is a convenient source of T cells, but it is also enriched in CD34- stem cells.

09:59 – 10:37, Slide 15

Hematopoietic stem cells can be purified based on the efflux of fluorescent dyes such as Rhodamine 123 and Hoechst 33342. Although expression of ATP-binding cassette transporters has been assumed to contribute to dye efflux in the stem cell compartment, little is known about which transporters are expressed or what function they might confer. We now know that CD34+ cells mainly express ABCB1, whereas CD34- stem cells express the ABCG2 transporter.

10:37 – 11:20, Slide 16

Multidrug resistance has been related to decreased drug accumulation. Using pharmacological assays to demonstrate this directly at the single-cell level, we incubated ABCB1-EGFP transfected pools with the fluorescent chemotherapeutic drug doxorubicin and visualized the corresponding fluorescence patterns by confocal microscopy. A representative field is shown here, reflecting the inverse relationship between P-glycoprotein-EGFP overexpression, seen as green fluorescence localized at the plasma membrane, and doxorubicin accumulation, seen as nuclear red fluorescence.

11:20 – 12:01, Slide 17

In this slide, we can see how these transporters contribute to dye efflux. In our lab, we have developed ABCG2 transfectants that can be easily monitored by fluorescence techniques, including, of course, flow cytometry. Many drugs are good fluorescent substrates, and some of these substances can intercalate into the DNA and accumulate in the nucleus. The reduced drug accumulation pattern present in ABCG2-expressing cells can also be reversed by pre-incubation with reversal agents such as verapamil or cyclosporine A.

12:01 – 13:03, Slide 18

Flow cytometry is increasingly used for quantitation of cellular fluorescence and drug retention, the effect of reversal agents, and the expression of drug resistance-related cell surface markers.

To achieve optimal growth conditions, cells are seeded in multiwell plates in medium before the drug uptake and retention experiment. Drug uptake is performed by adding the drugs to the culture medium in the presence or absence of verapamil, cyclosporine A, or other inhibitors. The concentration of the fluorescent substrate is optimized beforehand to allow appropriate electronic signal compensation. After one hour of incubation, cells are washed, fed with drug-free culture medium, and cultured for one hour at 37 degrees centigrade, again in the presence or absence of the reversing agents, to evaluate their effects on drug retention, which is monitored on a flow cytometer.

13:03 – 13:49, Slide 19

This slide describes the application of multiple substrates to identify drug-resistant cells: Rhodamine 123 or SYTO 13 in the green channel, doxorubicin in the orange channel, or mitoxantrone in the far-red channel. Staining times and concentrations should be optimized and combined with immunophenotyping. Gating of viable cells is based on light scatter parameters and on simultaneous staining with propidium iodide or 7-AAD. Analysis of 10,000 cells per sample is carried out in a fluorescence count log histogram, collecting the autofluorescence signal in the first decade.
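As a rough illustration of this analysis step, the sketch below gates PI-negative (viable) events and builds a four-decade log histogram of drug fluorescence. All data are synthetic and the gate threshold is invented; they stand in for, and are not taken from, the speaker's protocol.

```python
# Minimal sketch (synthetic data, invented thresholds): gate viable cells
# by low PI signal, then histogram drug fluorescence over 4 log decades,
# with autofluorescence expected in the first decade.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000                                          # cells per sample
pi = rng.lognormal(mean=1.0, sigma=0.5, size=n)     # PI signal
pi[:500] *= 100                                     # a PI-bright dead fraction
drug = rng.lognormal(mean=3.0, sigma=1.0, size=n)   # drug fluorescence

viable = pi < 50                                    # PI-negative (viable) gate
edges = np.logspace(0, 4, 101)                      # 4 log decades, 100 bins
counts, _ = np.histogram(drug[viable], bins=edges)

print(int(viable.sum()), "viable events,", int(counts.sum()), "in histogram")
```

In a real analysis the viability gate would be combined with the light scatter gate mentioned above, and the histogram would come straight from the cytometer's listmode data rather than simulated values.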

13:49 – 14:24, Slide 20

Cancer and leukemic cells can simultaneously express different multidrug resistance transporters, which can be easily distinguished by combining fluorescent probes and multidrug reversal agents. Cyclosporine A is generally specific for P-glycoprotein, and probenecid for ABCC1. By using calcein AM, a molecular probe that can be excluded by these two transporters, different fluorescence profiles can be observed, indicating that cells can evade cancer treatments consisting of multiple drug combinations.

14:24 – 15:06, Slide 21

As we can see in this slide, flow cytometry can also be used to monitor drug efflux over time. Stable ABCG2 transfectants were preloaded with mitoxantrone, a fluorescent drug that can be excited with red lasers, showing that specific transporter mutations consisting of a single amino acid substitution are crucial for changes in the velocity of the transporter, making cells more resistant to treatment. As you can see here, maximum transport velocity can be easily estimated, and this information can be very helpful for clinicians because it has a direct impact on reduced drug efficacy.
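One common way to estimate an efflux rate from such a time course is a log-linear fit, assuming simple first-order kinetics; that assumption and all the values below are ours, not necessarily the speaker's model or his mitoxantrone data.

```python
# Hedged sketch: estimating an efflux rate from a dye-efflux time course.
# Assuming first-order efflux F(t) = F0 * exp(-k t), a log-linear fit
# recovers k, and the initial transport velocity is k * F0.
import numpy as np

t = np.array([0, 5, 10, 20, 30, 45, 60], dtype=float)   # minutes
f0, k_true = 1000.0, 0.05                                # arbitrary units
mfi = f0 * np.exp(-k_true * t)                           # noiseless course

slope, intercept = np.polyfit(t, np.log(mfi), 1)
k_est = -slope
v0 = k_est * np.exp(intercept)        # initial efflux velocity, units/min
print(round(k_est, 3), round(v0, 1))  # 0.05 50.0
```

With real, noisy median-fluorescence data the same fit gives an estimate rather than an exact recovery, and comparing k across transporter mutants is what the slide's velocity comparison amounts to.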

15:06 – 15:46, Slide 22

Multidrug resistance in human leukemia and lymphoma can also be detected by using monoclonal antibodies such as MRK-16 and MRK-20. The immature CD34+ phenotype is strongly linked with P-glycoprotein. An important advantage of flow cytometry for the detection of multidrug-resistant cells is the possibility of identifying multiple subsets of hematopoietic cells using lineage-specific or differentiation antigens. Those cells can be separated using positive or negative sorting procedures for gene expression analysis.

15:46 – 16:19, Slide 23

Here, we can see a phylogenetic tree of human ABC transporter proteins. In this figure, only part of the tree is presented, showing members of the ABCB, ABCC, and ABCG families. The transporters discussed in this webinar are ABCB1, expressed in CD34+ cells, and ABCG2, expressed in CD34- side population cells.

16:19 – 16:56, Slide 24

The side population, SP, has become an important hallmark for the definition of the stem cell compartment, especially for the detection of stem cells and for their physical isolation by fluorescence-activated cell sorting. SP cells are CD34- and were discovered using ultraviolet excitation, based on the efflux of Hoechst 33342; more recently, violet excitation based on the efflux of Vybrant DyeCycle Violet (DCV) stain has been documented to discriminate side population cells.

16:56 – 17:29, Slide 25

The side population displays a uniquely low Hoechst fluorescence emission, and Hoechst-low cells are described as side population cells by virtue of their typical profiles in red-versus-blue Hoechst or DCV dot plots. Moreover, the degree of dye exclusion seems to correlate with the differentiation state. In fact, SP cells showing the highest efflux activity are the most primitive in terms of differentiation potential.
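To illustrate what an SP gate on such a dot plot does, the sketch below applies a simple rectangular gate to synthetic red/blue values. Real SP gates are drawn by eye on the dot plot and confirmed with an efflux inhibitor; every number here, including the gate boundaries and the 0.5% SP frequency built into the simulation, is invented.

```python
# Minimal sketch of an SP gate on Hoechst red-vs-blue data (all values
# synthetic). SP cells sit dim on both axes, off the main G0/G1 cluster.
import numpy as np

rng = np.random.default_rng(1)
main_blue = rng.normal(600, 40, 9950)    # G0/G1 cluster, bright
main_red = rng.normal(400, 40, 9950)
sp_blue = rng.normal(150, 30, 50)        # dim "tail" cells
sp_red = rng.normal(80, 20, 50)
blue = np.concatenate([main_blue, sp_blue])
red = np.concatenate([main_red, sp_red])

in_gate = (blue < 300) & (red < 200)     # a simple rectangular SP gate
print(f"{100 * in_gate.mean():.2f}% SP events")
```

In practice the SP gate is usually a polygon hugging the tail, and an inhibitor control (verapamil or fumitremorgin) is run to verify that the gated events disappear when efflux is blocked.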

17:29 – 17:49, Slide 26

ABCG2 is the only transporter responsible for the formation of the Hoechst 33342 flow cytometric fluorescence profile, and this profile is blocked in the presence of multidrug reversal agents, because Hoechst efflux is reduced in the presence of verapamil and other drugs.

17:49 – 18:41, Slide 27

The side population displays a very interesting chromatic shift in fluorescence. This chromatic shift is explained by changes in the cellular concentration of Hoechst 33342. As Hoechst moves across the cell membrane because of ABCG2 activity, the blue and red fluorescence ratio follows the course shown in typical Hoechst red-versus-blue dot plots. In addition, sample manipulation heavily affects side population resolution, and every effort should be made to minimize sample processing time. Excess time spent on isolation and preparation of cells can result in suboptimal SP analysis. For this reason, samples should be processed immediately, avoiding density gradients if possible.

18:41 – 19:09, Slide 28

There is no doubt that different factors may alter the side population distribution, from the tip to the G0/G1 cluster. As I said previously, it is generally accepted that Hoechst-low cells are the more primitive side population cells. However, in many situations, specimens assayed for the side population have a low number of SP cells and show heterogeneous profiles.

19:09 – 20:01, Slide 29

We believe that this method can be difficult for most investigators: first, because the ability to discriminate side population cells is based on the differential retention of Hoechst or DyeCycle Violet dye during a functional assay; second, because of the difficulty in setting the right experimental conditions; and third, because the analysis of the acquired data requires extensive flow cytometry expertise to accurately detect the side population events. The prevalence of SP cells is very low: 0.05% of normal whole bone marrow in the mouse and 0.09% in normal human samples. However, it should not be difficult to find the side population if the whole procedure is followed as we and others have described.
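At prevalences this low, simple Poisson counting statistics show why deep acquisitions are needed. The sketch below computes the expected number of SP events and the resulting counting error for the 0.05% figure at two hypothetical acquisition depths.

```python
# Rare-event counting sketch: expected SP events and Poisson counting
# error (CV = 100 / sqrt(N)) at a given prevalence and acquisition depth.
import math

def sp_counting(prevalence, cells_acquired):
    expected = prevalence * cells_acquired
    cv = 100 / math.sqrt(expected)  # Poisson CV, percent
    return expected, cv

# 0.05% prevalence (mouse bone marrow) at two acquisition depths
for n in (100_000, 1_000_000):
    exp_events, cv = sp_counting(0.0005, n)
    print(f"{n} cells -> {exp_events:.0f} SP events, CV {cv:.0f}%")
    # 100000 cells -> 50 SP events, CV 14%
    # 1000000 cells -> 500 SP events, CV 4%
```

This is one quantitative reason the speaker stresses expertise and careful conditions: at 100,000 events a 0.05% population yields only about 50 cells, so any gating drift or debris is a large fraction of the signal.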

20:01 – 21:07, Slide 30

Here, you can see an animation showing the whole staining process from zero to two hours, the incubation time needed to unveil the side population tail in human specimens. Since mutant ABCB1, or another transporter, known or unknown, may be able to efflux Hoechst or DyeCycle Violet, cells clustering within the side population may be wrongly considered primitive stem cells, especially when such transporters are expressed in cancer and leukemia. Although the identification of the side population is based on Hoechst or DyeCycle Violet staining, alternative assays to examine the side population in cancer samples are needed, such as co-staining with a monoclonal antibody against the external domain of ABCG2.

21:07 – 21:24, Slide 31 

Although we were initially interested in the therapeutic application of SP cells, we moved forward to try to elucidate their role in cancer. Side population cells exist in tumors, have a key role in drug resistance, and, of course, are also able to evade chemotherapy.

21:24 – 21:50, Slide 32

We then looked at the side population in cancer samples. In normal human bone marrow, the side population is composed of very rare cells, which account for 0.05% of nucleated cells or less. However, in bone marrow preparations from leukemia [subjects], side population cells can be found in large numbers, up to 15% or more.

21:50 – 22:12, Slide 33

Here, we can see the side population pattern of Hoechst efflux in the bone marrow of a [subject] with a diagnosis of acute myeloid leukemia. The side population contains increased levels of side population cells, from high to low fluorescence, giving duplicated SP tails and SP profiles.

22:12 – 22:36, Slide 34

In this slide, we can see the DNA content, through high-resolution cell cycle analysis, corresponding to the same [subject] diagnosed with AML, showing a secondary tetraploid subpopulation as calculated from the linear relationship between DNA content and propidium iodide fluorescence.
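The linear relationship mentioned makes the ploidy calculation straightforward: the DNA index of an abnormal peak is its PI fluorescence relative to the diploid G0/G1 peak. The channel positions below are hypothetical, not the values from this slide.

```python
# Sketch of the linear DNA-content calculation: PI fluorescence scales
# linearly with DNA content, so the ploidy of an abnormal peak follows
# from its position relative to the diploid G0/G1 peak.

def dna_index(peak_channel, diploid_g0g1_channel):
    """DNA index: 1.0 = diploid, 2.0 = tetraploid."""
    return peak_channel / diploid_g0g1_channel

g0g1 = 200         # hypothetical diploid G0/G1 peak channel
secondary = 400    # hypothetical abnormal secondary peak channel
di = dna_index(secondary, g0g1)
print(di)  # 2.0 -> tetraploid subpopulation
```

Note that a normal G2/M peak also sits at twice the G0/G1 channel; a tetraploid subpopulation is distinguished by its own G1 and G2 peaks and its abnormal size, which is what high-resolution cell cycle analysis resolves.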

22:36 – 23:12, Slide 35

From our experience, circulating SP cells are not usually detectable in peripheral blood. In this slide, we report the only [subject] we have observed with circulating SP cells in peripheral blood, showing that only 11 SP cells, from a total of 100,000 cells in the sample, expressed the CD34 antigen, suggesting that they could have a key role in sustaining leukemic cells.

23:12 – 23:39, Slide 36

The mechanisms underlying drug resistance are still poorly understood, but various stem cells often express higher levels of drug resistance proteins such as the ABCG2 and ABCB1 transporters. Augmented levels of these transporters in cancer stem cells may contribute to the refractoriness of metastatic cancer to chemotherapy and should be considered as new drug targets.

23:39 – 23:54, Slide 37

We then went back to our animal models, trying to identify side population cells in human prostate cancers. However, all our efforts to detect side population cells in these tumors were unsuccessful.

23:54 – 24:12, Slide 38

Surprisingly, the analysis of mouse bone marrow cells obtained by flushing their [femora] showed cell types not present in normal bone marrow. In this slide, we can see the different cell morphologies in primary bone marrow cultures.

24:12 – 24:33, Slide 39

In order to identify these cells, we then prepared cytospins by cytocentrifugation, which were incubated with an anti-human cytokeratin monoclonal antibody and analyzed by fluorescence microscopy, showing that the observed abnormal cells were human infiltrating cells.

24:33 – 25:00, Slide 40

Moreover, we then analyzed normal mouse bone marrow samples using the side population assay, as well as bone marrow cells from xenografts, showing that almost all, if not all, human infiltrating cells were side population cells, suggesting that SP cells may have a key role in bone marrow infiltration and that bone marrow could also be a reservoir for tumor SP cells.

25:00 – 25:35, Slide 41

More recently, trying to unveil the side population compartment, we have been performing a series of experiments aimed at analyzing the microRNA expression profile in SP cells. A large number of microRNAs, up to 108, seem to be deregulated in side population cells, and we hope to identify candidate microRNAs that could play a key role in explaining the complex biology of these cells.

25:35 – 26:11, Slide 42

In addition, we have moved forward more ambitiously to study the side population compartment, and we are performing a series of experiments aimed at analyzing the gene expression profile in SP cells. A large number of genes, up to 15,000, are expressed in SP cells, and we hope to identify new candidate genes that could play a key role in explaining the complex biology of these cells.

This slide displays the gene expression profile of primary tumor cells under hypoxic and normoxic conditions.

26:11 – 26:53, Slide 43

The next slide displays the gene expression profile of side population cells isolated by cell sorting from primary cells. For gene expression studies, primary cells are stained with Hoechst as described; keep in mind that side population cells are initially detected in low numbers, a small percentage of less than 1%, prior to cell sorting experiments. Cells can then be resorted to obtain cultures enriched in side population cells, up to 80%. For gene expression profiling comparisons, the same number of side population and non-side population cells should be sorted.

26:53 – 27:19, Slide 44

In this slide, data from two representative experiments are reported as the mean of triplicate determinations plus or minus the standard deviation. Real-time PCR of mRNA from human astrocytoma cells and gliomas, using specific primers for ABCG2, SMO, and PTCH in unselected cells, is compared with the following slide.
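Relative expression from real-time PCR triplicates like these is commonly computed with the 2^-ddCt method. The sketch below assumes that approach; the Ct values and the reference gene are invented for illustration and are not the data on the slide.

```python
# Hedged sketch of relative expression by the standard 2^-ddCt method
# (invented triplicate Ct values; a housekeeping reference gene is assumed).
from statistics import mean

def rel_expression(ct_target, ct_ref, ct_target_cal, ct_ref_cal):
    """Fold change of a target gene versus a calibrator sample,
    normalized to a reference gene (2^-ddCt)."""
    dct = mean(ct_target) - mean(ct_ref)              # sample dCt
    dct_cal = mean(ct_target_cal) - mean(ct_ref_cal)  # calibrator dCt
    return 2 ** -(dct - dct_cal)

# Hypothetical triplicates: ABCG2 in sorted SP cells vs. unselected cells
fold = rel_expression(
    ct_target=[24.1, 24.0, 23.9],      # ABCG2, SP cells
    ct_ref=[18.0, 18.1, 17.9],         # reference gene, SP cells
    ct_target_cal=[27.0, 27.1, 26.9],  # ABCG2, unselected cells
    ct_ref_cal=[18.0, 18.0, 18.0],     # reference gene, unselected cells
)
print(round(fold, 1))  # 8.0 -> an 8-fold enrichment with these invented values
```

The 2^-ddCt shortcut assumes near-100% amplification efficiency for both genes; with unequal efficiencies a standard-curve method would be used instead.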

27:19 – 27:41, Slide 45

This slide shows the same analysis in the side population isolated from the same cells. ABCG2 was inversely expressed when compared with SMO and PTCH, giving support to the hypothesis that ABCG2 regulates SMO levels.

27:41 – 28:55, Slide 46

The decline in the expression of PTCH, meanwhile, may represent a compensatory mechanism resulting from SMO downregulation. The relative numbers of initially detected SP cells ranged from near zero to 3% of the propidium iodide (PI)-negative cells within the live region. Subsequently, cells were sorted to obtain products enriched in SP cells, ranging from 40% to 80%. Accordingly, a representative experiment on the enrichment and purification of SP cells using human astrocytoma cells is shown. SP cells were initially detected in low numbers, a small percentage of less than 1%, prior to cell-sorting experiments. Isolated SP cells were cultured for two weeks following sorting, under the same conditions as unsorted cells, giving approximately 25% SP cells. Sorted cells were cultured again for two weeks, giving approximately 30% SP cells, and were then resorted in the same way to obtain highly purified products enriched in SP cells.

28:55 – 29:35, Slide 47

For competition experiments using cyclopamine, SP cells were sorted again. Only cell cultures with a minimum of 40% SP cells were used in order to study the effect of cyclopamine and temozolomide on Hoechst uptake. A significant decrease in SP cells was a consequence of adding cyclopamine, whereas temozolomide had no effect on Hoechst efflux, indicating that the decrease in side population cells was not a consequence of depletion of the side population compartment but rather a direct consequence of the molecular mechanisms involved in the competing affinities of cyclopamine and Hoechst dye for the ABCG2 transporter.

29:35 – 30:25, Slide 48

Hedgehog signaling plays a role in many processes during embryonic development and remains active in the adult, where it is involved in the maintenance of the stem cell compartment and of somatic stem cell populations. Aberrant Hedgehog signaling can, in some cases, lead to certain forms of cancer. In the absence of ligand, PTCH represses SMO, preventing the activation of Hedgehog signaling. In the presence of Hedgehog, the inhibitory effect of PTCH on SMO is relieved, and SMO becomes phosphorylated and triggers signaling that induces the transcription of Hedgehog target genes, as you can see in this animation.

30:25 – 31:28, Slide 49

In summary, PTCH has been identified in the Hedgehog signaling cascade, playing a key role in the control of stem cell proliferation. Moreover, aberrant regulation of the Hedgehog pathway has been associated with different cancers. Here, we propose a model for Hedgehog deregulation in light of our data, with an expected role for ABCG2. SMO activity can be modulated by synthetic molecules, cyclopamine, and endogenous metabolites. Our results suggest that ABCG2 acts directly as a firewall on SMO, and maybe also on other potential targets involved in the Hedgehog signaling cascade, preventing their binding in a concentration-dependent manner and adding more complexity to Hedgehog regulation.

31:28 – 31:40, Slide 50

For additional information and for more details on stem cell use and function and on drug resistance analysis, please consult Current Protocols in Cytometry.

31:40 – 31:56, Slide 51

Thank you very much for your attention. If you have any questions, please do not hesitate to ask. Thank you very much.

31:56 – 32:22, Slide 52

All right. Thank you so much, Dr. Petriz. Let’s go ahead to the question and answer segment. If you haven’t yet submitted a question for Dr. Petriz, now is the time to do so by clicking on the Ask a Question box at the bottom of your screen. Let’s see what questions have come in so far. Starting out here, one question. Dr. Petriz, are you ready?

Yes, I'm ready.

32:22 - 33:02, Question 1

Great. First question. Which ABC transporter inhibitor do you prefer to use for side population?

Yes. I prefer fumitremorgin, which is well reported; many publications in the literature use this inhibitor. Verapamil is also one of the preferred inhibitors, but in my opinion, fumitremorgin is the better inhibitor. Verapamil is also a good inhibitor of P-glycoprotein.

33:02 – 33:50, Question 2

Okay. Next question is, is dye exclusion the only technique that can be used to detect SP cells?

Yes, dye exclusion is the only available technique, in combination with functional flow cytometry, because we are measuring the functional activity of the ABCG2 transporter, and this is fundamental. We need to use functional flow cytometry to identify side population cells. In addition, we can combine this functional study with a monoclonal antibody against an external domain of the ABCG2 transporter to confirm that we are looking at true side population cells.

33:50 - 35:38, Question 3

This next question actually has two parts, I’ll read the whole thing and you can answer it how you like. Can we assess the efficacy of drugs on CSC activity and how can you differentiate between CSC and normal SC? 

Yes. This is an excellent question. It is really difficult to identify stem cells. Research has shown that cancer cells are not all the same: within a malignant tumor, or among the circulating cancer cells of a leukemia, there can be a variety of cell types. The stem cell theory of cancer proposes that, among all cancer cells, a few act as stem cells that reproduce themselves and sustain the cancer, much like normal stem cells sustain our organs and tissues. It is very difficult to distinguish these different populations. I think that we should look at the tissue where these cells originated, and then we need to look at specific markers, because these cells can be very plastic and can express different markers. We have very nice results that confirm this, obtained with side populations from brain, showing that they can simultaneously express primitive markers in combination with markers from mature neurons.

35:38 - 37:08, Question 4

The next question we have, is there any role for notch in the regulation of ABCG2 expression and activation?

Yes. We have preliminary data, which I did not show here, because we suspect that ABCG2 also regulates different signaling pathways that are crucial for the maintenance of the stemness of the very primitive progenitors. In addition, we have very nice gene expression data showing that the most primitive progenitors express high levels of multidrug transporters, and we believe that the most primitive stem cell phenotype is associated with overexpression of multidrug transporters. During differentiation, these cells lose the expression of these multidrug transporters, with the exception of a minimal number of mature cells in the hematopoietic system, such as T cells; side population cells can express P-glycoprotein, and NK cells can also express multidrug resistance transporters.

37:08 - 38:44, Question 5

Okay. Onto the next question then, have other ABCB transporters been linked specifically to tumor stem cells?

I think that, in the near future, we will be able to identify new multidrug transporters associated with the stem cell phenotype, but also with the cancer stem cell or leukemic stem cell phenotype. At the present time, ABCG2 is a very promising marker to identify very primitive progenitors, and we need more data to learn more, not only about the phenotype, because there is growing knowledge of the stem cell compartment based on functional studies rather than phenotype alone. I think that most stem cell researchers will agree that it will be very difficult to find a combination of markers able to identify the stem cell phenotype. In addition, we are also looking at the normal stem cell phenotype, for instance, in the hematopoietic stem cell system.

38:44 - 39:30, Question 6

Onto the next question, is the side population necessary or sufficient for a cancer stem cell phenotype?

No, because there are some cancer stem cells that lack the SP phenotype. They have different patterns of markers, or a different phenotype, and, in addition, a different functional activity, with a minor degree of overlap; this function or phenotype does not always overlap between different cancer stem cell types.

39:30 - 41:53, Question 7

This next question seems very specific, so we’ll see how it goes. Will there be a significant difference between CD34+, CD24- cells in circulating tumor cells in blood compared to normal circulation of stem cells in blood?

Yes. This is also an excellent question. The main problem when we look at circulating tumor cells is that we are looking at very, very rare cells. The concentration of circulating tumor cells can be very low, and if we look at circulating cancer stem cells, we may have only one cell per liter of blood; this limit makes it very difficult to truly identify the phenotype of these cells. In addition, I have observed in very different tumors that these cells can aberrantly express different markers, and this is fundamental when studying astrocytomas or gliomas, which can lack the CD133 cluster of differentiation marker. I believe that it will depend on the origin of these cells and, in addition, because they can aberrantly express multiple markers, it is difficult to associate a specific stem cell marker with a cancer; we also lack a specific marker to identify leukemic stem cells or cancer stem cells. I believe that we need to look at the combination of phenotype and function, and go deeper with more functional studies, mainly by using multiparametric flow cytometry.

41:53 - 43:22, Question 8

All right. The next question we have. Have you experienced changes in efflux capacity in the cells if the cells remain in culture for an extended period of time?

Yes. This is also a very nice question. We have also observed that, during cell culture, cells undergo differentiation and lose the expression of the multidrug transporters. This is mainly observed at oxygen concentrations around 21%, the atmospheric concentration; if you culture your cells under low oxygen concentrations, you will be able to maintain the SP phenotype and also the transporter activity. If you would like to preserve the maximum activity of the multidrug transporter, you can resort to using very low concentrations of some drugs that help to maintain its expression. It is true that in extended culture there is a decrease in the activity, and also in the expression, of these multidrug transporters.

43:22 - 44:18, Question 9

Onto the next question. Does ABCG2 transfection drive or confer the SP phenotype?

This is also a very nice question, because in multiple talks, when we present transfection experiments, people ask me about the possibility of transferring the stem cell phenotype, but that is not the case. The transfection will transfer only the multidrug resistance capacity of these cells, and additional modifications would be needed in these cells to confer a stem cell phenotype.

44:18 - 45:40, Question 10

Onto the next question. Is there any physiological function for ABCG2 in normal stem cells?

Yes. In fact, we believe that the physiological function in very primitive normal or healthy stem cells is to protect the stemness of this compartment, and we believe that these transporters act like a firewall, protecting signal transduction aimed at conserving the stemness of these cells. In addition, as reported by Zhou and collaborators and published in Nature Medicine, the expression of the ABCG2 transporter in very primitive hematopoietic stem cells seems to repress hematopoietic differentiation, which allows the stemness of this compartment to be maintained. The ABCG2 transporter will also protect these cells against xenobiotic compounds or other toxic molecules that could kill them.

45:40 - 46:20, Moderator

All right, it is now time to wrap up the question and answer session. I’d like to let you know that today’s webinar has been recorded and will be available for viewing in the next few days. We will send you an email with details on how to access the recorded webinar, along with instructions on how to personalize and print a certificate of attendance as well as download the PDF of the slides.

On behalf of today’s speaker, Dr. Jordi Petriz, and our sponsor, Thermo Fisher Scientific, we greatly appreciate your attending today’s webinar, and we look forward to your attendance at future events from Current Protocols.

Basics of Multicolor Flow Cytometry Panel Design

With the proliferation of new fluorescent dyes, as well as instruments that can detect 18 or more parameters, multicolor flow cytometry has become more popular and more accessible than ever. This webinar will discuss the essentials of good panel design, including:

  • Rules for designing panels
  • Examples and practical application of these rules
  • Controls and standardization
  • Relevance of panel design to new mass cytometry platforms

View the recorded webinar

DNA Content Cell Cycle Analysis Using Flow Cytometry

Find out how careful acquisition and methodical preparation contribute to accurate and consistent DNA analysis. In this webinar we’ll discuss:

  • An overview of the methods and materials for using flow cytometry to determine cell cycle by measuring DNA content
  • Selection of DNA dyes for live cell and fixed cell analysis
  • Tips and tricks for consistent results

View the recorded webinar

Flow Cytometry in Microbiological Research

In recent years the application of flow cytometry in microbiological research has expanded from detection and quantification of organisms to more complex studies including analysis of host-microbe interactions and detailed spatial and temporal analysis of microbial metabolism in different environments. During this webinar we will discuss how the multi-parametric nature of flow cytometry can be applied to microbiology and the advantages of using this application over traditional microbiological methods.

View the recorded webinar

Basics of Flow Cytometry, Part I: Gating and Data Analysis

An introduction to fluorescence, this presentation covers the following topics: the principle of the flow cytometry platform, the basic components of a flow cytometer, how to interpret a dye excitation/emission spectrum, how data is displayed, basic gating demonstration, and common statistics and terminology used in flow cytometry.

View the recorded webinar

Basics of Flow Cytometry, Part II: Compensation

This presentation provides an overview of basic fluorochromes used in flow cytometry. Topics include: the principle of compensation, how to perform compensation, the types of controls recommended and their use, basic strategies for designing a flow experiment, and data presentation.

View the recorded webinar

An Introduction to Flow Cytometric Analysis Using Molecular Probes Reagents,
Part I: Cell Proliferation Analysis

In this free webinar, we will discuss flow cytometric analysis of cell proliferation, including CellTrace Violet and CellTrace CFSE, Click-iT EdU, dual-pulse labeling with EdU and BrdU, Vybrant DyeCycle stains for live cell cycle analysis, and FxCycle stains for fixed cell cycle analysis.

View the recorded webinar

An Introduction to Flow Cytometric Analysis Using Molecular Probes Reagents,
Part II: Viability, Vitality, & Apoptosis Analysis

In this free webinar, we will discuss flow cytometric analysis of apoptosis and identification of dead cells. Changes introduced by apoptosis can be assessed with numerous assays measuring membrane structure, mitochondrial function, metabolism, caspase activity, membrane integrity, and DNA fragmentation. We will also discuss dead cell identification using traditional impermeant nucleic acid dyes.

View the recorded webinar

Not for resale. Super Bright Polymer Dyes are sold under license from Becton, Dickinson and Company.