Flow Cytometry webinars

Our flow cytometry webinars are designed to give a basic framework for learning experimental setup, compensation, and data analysis. These webinars also provide tips and tricks from our experienced technical support scientists and offer guidance for selecting the best products to help you achieve optimal results.

Recorded flow cytometry webinars

Panel Design for Multi-Parameter Flow Cytometry

The correct controls make or break your multicolor panel. Learn the value of including the correct controls and reagent titration. Understand the importance of fluorescence spillover and finding your fluorescence spectral balance.

00:00:00 – 00:01:18, Slide 1

Moderator: Hello everyone and welcome to today’s broadcast, Experimental Design Best Practice for Multicolor Flow Cytometry, presented by Carol Oxford, field application scientist, Thermo Fisher Scientific. I’m Xavier and I will be your moderator for today’s event. Today’s educational web seminar is brought to you by LabRoots and sponsored by Thermo Fisher Scientific. For more information on our sponsor, please visit www.thermofisher.com. Now before we begin, I would like to remind everyone that this event is interactive. We encourage you to participate by submitting as many questions as you want, any time you want during the presentation. To do so, simply type them into the “Ask a question” box and click on the send button. We’ll answer as many questions as we have time for at the end of the presentation. Also, please note that you will be viewing the presentation in the slide window. To enlarge the window, click on the arrows at the top right-hand corner of the slide window. If you have trouble seeing or hearing the presentation, click on the support tab found at the top right of the presentation window, or report your problem by clicking on the “Ask a question” box located on the far left of your screen.

00:01:18 - 00:01:55

This presentation is educational and thus offers continuing education credits. Please click on the continuing education credits tab located at the top right of the presentation window and follow the process to obtain your credits. I’d like to now introduce our presenter Ms. Carol Oxford. Carol Oxford is currently a field application scientist with Thermo Fisher Scientific. She has worked with FlowJo LLC and was one of the founding partners of ExCyte, a flow cytometry education company. Ms. Oxford, you may begin your presentation.

00:01:55 - 00:03:03

Thank you so much, Xavier. I really appreciate the opportunity to come talk to you about flow cytometry. I’ve done this for quite some time; I was the UC Davis Flow Cytometry Core director for about 25 years. I think flow cytometry is one of the most elegant technologies in science. When I started in flow cytometry, multiparameter meant two fluorescent parameters, and now the people at the cutting edge of flow cytometry are doing 32 fluorescent parameters. It’s changed quite a bit during that time. I’m really thankful to all the mentors that I had, and now I have a lot of experience, so hopefully you’ll find some information here that’s helpful to you as you design your multiparameter panels. I put my email address here on the slides so that you can contact me after the seminar if you want any of the information that I give here. You can also email me about other topics in flow cytometry, and I can point you to some resources as well during the seminar.

00:03:03 - 00:03:54, Slide 2

Let’s just talk about flow cytometry in general. There are advantages and disadvantages of course to every technology that we use in science. There are a lot of advantages to flow cytometry. One of the things that I like the most about this technology is that we get a lot of information on a per cell basis. We use a population of cells that have a lot of heterogeneity, but we’re doing this on a per cell basis. It’s extremely quantitative. We look for statistics like population frequency and expression level on a large number of cells. We can look at tens of thousands or even millions of cells, in an extremely short period of time. We can generate this data very quickly.

00:03:54 - 00:04:50, Slide 3

Of course there are some limitations to flow cytometry. Despite our best efforts, the cells must be in suspension, and those of us who have run a core facility or run our own samples realize that putting clumps of cells or pieces of tissue in a flow cytometer quickly ends that experiment. We spend a lot of time fighting with the flow cytometer. We have to get the cells in suspension, we have to make sure they’re filtered, or we use enzymes to make sure the cells pass through the flow cytometer. The other limitation is that we get no spatial information unless we use a complementary technology like microscopy or an ImageStream, which gives us spatial information about where the cells are. To flow cytometrists, cells are an electronic pulse. We don’t get any information about where the fluorescence is inside the cell.

00:04:50 - 00:05:57, Slide 4

One thing that’s interesting about flow cytometry data is that it’s very easy to misunderstand. I get this all the time from clients and students: because I’ve had 30 years of experience, they want me to look at their data or they want me to analyze their data. What I want to present today is that you don’t have to have 30 years of experience to get good flow cytometry data. This data is not subjective. It’s very important to have adequate controls, to understand a little about physics and a little about antibodies, and to do a very careful analysis. People who are just starting out in flow can do the analysis the same way as somebody with 30 years of experience. Sometimes, in fact most of the time, you’ll have more controls than you will have samples. It’s just really important to follow some key steps to get really good flow data, and we’ll talk about most of those here.

00:05:57 - 00:06:32, Slide 5

Before you panic, yes, this slide is extremely busy, and it lists a tremendous number of applications in flow. There were a lot of times in my career that I thought, “We’re pretty well done. We have a number of applications but we’re kind of stuck.” Then fluorescent proteins came to the table, certain dyes came to the table, probes came to the table, and all of a sudden we were off and running again.

00:06:32 – 00:06:42, Slide 6

I’m certainly not going to go over any of these applications in particular, but I wanted to say that if we can develop a fluorescent probe or a fluorescent marker for it, the list of flow cytometry applications is almost endless.

00:06:42 - 00:06:59, Slide 7

Let’s talk about flow cytometry best practice. These are the things we’re going to talk about today. They don’t encompass all of flow cytometry best practice, but they’re certainly important to panel design.

00:06:59 - 00:07:46

The first thing we’re going to talk about is how do we know the machine is performing optimally? Certainly, the manufacturer of flow cytometry instrumentation is going to develop a protocol, to give you an idea of whether the machine is performing optimally. We’re also going to talk about what you might want to do above and beyond this level of testing to make sure you’re going to get good data. We’re also going to talk about how to choose fluorophores. It’s very important to understand some of the characteristics of fluorophores that might make you choose one over the other and give you an advantage in panel design. We’re certainly going to talk about titration. This is one of the most important and most simple concepts that will give you good data and make panel design much easier.

00:07:46 - 00:08:32

Then of course getting along is important in the workplace, it’s important in relationships, and it’s incredibly important in panel design. Not all fluorochromes get along, and we’ll talk about which ones do, which ones don’t, and how you can get them into a panel together. Then of course, control issues. Me, well, I didn’t think I had control issues, but when you become an expert in multicolor panel design, you’re going to have to have control issues. You’re going to run a lot of controls, you’re going to want to know what each one is for, and you’re going to have a lot of them. We talked about that already, and you’re probably going to be sick of how much I talk about them in this talk, so bear with me. You’ll still love flow cytometry by the end of the talk, I do, but we do have a lot of controls.

00:08:32 - 00:10:30, Slide 8

One of the most important things to understand, of course, when you’re using any machine is what it does. Now, I’m not going to get into the heart of the physics because, well, I shouldn’t say I don’t like physics, but I’m not a physicist. I understand the least I need to know about physics to understand how the machine works. In the upper left corner of the slide, you’ll see what happens when you open the machine, the Attune NxT Flow Cytometer, which I represent as a field application scientist. Here you can see the filters of the machine and you can see the lasers and some of the optics. It looks pretty complicated, but it all boils down to the detector. What is the cytometer doing? It’s counting photons. We measure the photons that are emitted when the laser excites the fluorochrome. The photons hit the photocathode at the front of the PMT. The electrons that are knocked off that cathode go through a dynode cascade and are amplified; they then come off the other side of the PMT, and we measure them as an electronic pulse. Very, very straightforward. That is all you need to know: we quantify the photons, we measure the electronic pulse, and there you go. If you know a little bit about physics and a little bit about photons, you’re now a flow cytometry expert.

00:10:30 – 00:13:33, Slide 9

When we started in flow cytometry we used analog instruments. We didn’t know a lot about PMTs, or at least I didn’t. We had a four-decade scale to graph our data. We could walk up to an instrument, take an unstained sample, and put it in that magic place, which we now know is not magic at all, the first log decade on our scale. We knew that most of our fluorochromes, most of our samples, would fit in that space, and maybe sometimes we had to turn the voltage down, but otherwise we got pretty good data. Then we started using analog-to-digital hybrid machines and then fully digital instruments, and we saw all sorts of things happening. We saw that if we cranked the voltage up, populations appeared that we didn’t see before. We sometimes saw our background increase; we saw all sorts of things. We realized that we needed to understand a lot more about PMTs. We needed to set voltages that were appropriate for our experiments. We needed to do some optimization that’s now been called either a voltage walk or a voltration. Voltration sounds really cool; if you print it on a t-shirt it makes you very popular at parties. I’m just kidding. I also find my jokes really humorous. What does a voltration do? It assures the best signal-to-noise detection, which is what we want. In our data it’s important to know the difference between background, negative and positive. We also want to assure that all the measurements are being made in the linear range of the detector. Before, we only had four decades, and now we have instruments with five, six and seven decades of range. We need to know how many decades of real data we get from the electronics. If we decrease the voltage below the minimum voltage that we find in a voltration, the resolution of dim populations may be lost. Increasing the voltage for a given PMT above this voltage gives no advantage in population resolution. I also get a question all the time from customers: why not just put a sample on and increase the voltage until the positive is almost off scale, which is really what we first did? If a low voltage is a problem because we’re detecting the noise of the instrument, why not just crank the voltage up? Then we realized that instrument noise can also become a problem and we get spillover spreading from the machine. I’ll explain this as well.

00:13:33 – 00:13:44, Slide 10

This is one way to do a voltage optimization. We use here our UltraComp beads and I’ll talk about those in a minute.

00:13:44 - 00:14:17, Slide 11

Basically, they’re antibody capture beads, and they contain a binding bead and a non-binding bead. The binding beads bind to the variable region of your antibody, so you can put your antibody in the tube, put the beads in the tube, and you get two populations. Here you can see that at a PMT voltage of one volt, which is basically off, we haven’t applied enough voltage to the PMT to get a signal, and there’s basically only one population.

00:14:17 – 00:15:00, Slide 12-18

Neither of those beads is giving enough signal to see, so we start walking the voltage up, very straightforward, until two populations appear and then the beads start to go off scale, stepping the voltage by about 25 volts at a time. The detector here, if you can’t see the slides very well, is the BL1 detector on the Attune; it is excited by the blue laser and its emission filter is 530/30, so we are collecting green emission off blue laser excitation.

00:15:00 – 00:15:06, Slide 19

If we do what’s called a concatenation in FlowJo, you can look here and see some interesting things about the data.

00:15:06 - 00:15:40, Slide 21

You can see that down at the low end, at 250 and 300 volts, the data is not linear. You can see the data starts to become linear when we hit 350 volts. We know we want to be above that range. You can also see that maybe somewhere around 375 volts, that background starts to spread. That’s going to become a huge issue when we start using multicolor panels.

00:15:40 - 00:15:50

Here, how would we decide what voltage to use? I can pick a magic red arrow there and I can tell you that’s what voltage I’ve picked and I’ll tell you why in a minute.

00:15:50 – 00:16:15, Slide 22

Here’s the voltage optimization that’s been done on an Attune NxT. I’ve walked five PMTs off our blue laser, three off the red and three off the violet. Now you can do this on your machine as well; you walk each PMT individually, and this is what you get when you visualize the concatenation in FlowJo.

00:16:15 – 00:18:06, Slide 23-25

What in the world do you do with all that data? There are a number of things you can do. What we did was submit an abstract at CYTO, and I can certainly send that to you if you want. It will also be published in our BioProbes, and I can send you the link to that as well on our website. Here is the data, looking again at that BL1 PMT and also the YL4 PMT. The YL4 PMT is yellow excitation and the channel that you would – sorry, I don’t know the filter there. That’s the channel for PE/Cy7, so long red emission off the yellow laser. We looked at several metrics here: the stain index, which I’ll explain in a minute, and I’ll give you the formula for that. You can Google Marty Bigos and stain index for the publication on that. We also looked at separation index, we looked at the robust CV of the positive signal, and we also looked at two times the standard deviation of the negative signal. There are a lot of metrics you can look at here. Basically, you can look at where the robust CV starts to decline and flatten out. Then you can also look at where the peak of the stain index is and where that starts to flatten out. Here for the YL4 PMT, for example, that starts to happen at about 400. These are some various methods and metrics you can look at for the voltage optimization. If you look at these graphs, you can certainly argue that there are several voltages that might work, and that’s a valid argument.

00:18:06 - 00:18:18

The next step in this process is to take your data and run it at voltages around this minimum voltage, say 375, 400 and 425, and see how this affects your data. Look at some data with low-expressing markers and see if it makes a difference. The real point of this optimization is to find a minimum baseline voltage. You may adjust those voltages up or down, but probably not. This is a good place to start every experiment.

00:18:18 - 00:20:18, Slide 26

Here’s another option for your voltage optimization. Here you have a hard-dyed bead. The bead that I use for this optimization is peak 2 of the Spherotech 8-peak bead set. This is a hard-dyed bead with a single peak, so I don’t have two peaks here that I have to set a gate on, and I used some very simple metrics. I used the robust CV and the robust standard deviation, I looked at those graphs simply in Excel, and I looked at the range. I didn’t look at the inflection point, which is also an interesting point, but I looked basically at where the robust CV leveled off and at where the robust standard deviation started to rise. For example, for this PMT, I would probably pick a voltage of about 350 for my minimum baseline voltage. The reason I would pick that voltage is that I get a good flat CV, so I know increasing the voltage there would not give me an advantage, and I have a very low contribution of the electronic noise as measured by the standard deviation. I’ve done this on multiple occasions with multiple machines. I’ve also run data at these settings, and it suggests that I get a very good signal and good data.
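As a rough sketch of what that analysis looks like if you script it instead of using Excel: the snippet below assumes you have already exported the single-peak bead intensities for each voltage step into a simple dictionary; the data layout, tolerance, and function names are illustrative only and are not part of any vendor software.

```python
# Minimal sketch of the peak-2 voltage-walk analysis described above.
import numpy as np

def robust_stats(events):
    """Median-based robust SD (1.4826 * MAD) and robust CV (%)."""
    med = np.median(events)
    rsd = 1.4826 * np.median(np.abs(events - med))
    rcv = 100.0 * rsd / med if med > 0 else np.inf
    return rcv, rsd

def pick_baseline_voltage(walk, rcv_tolerance=1.05):
    """Lowest voltage whose robust CV is within rcv_tolerance of the plateau
    (the best rCV seen across the walk), i.e. where more voltage no longer
    improves resolution."""
    stats = {v: robust_stats(ev) for v, ev in sorted(walk.items())}
    plateau_rcv = min(rcv for rcv, _ in stats.values())
    for v, (rcv, rsd) in sorted(stats.items()):
        if rcv <= rcv_tolerance * plateau_rcv:
            return v, rcv, rsd
    return None

# Example usage with exported bead data (made-up layout):
# walk = {250: np.array([...]), 300: np.array([...]), 350: np.array([...])}
# print(pick_baseline_voltage(walk))
```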

00:20:18 - 00:21:26

The next step that I do in this optimization is to take another hard-dyed bead, for example the peak 5 bead, and walk my instrument for all PMTs doing this optimization. When I find those voltages, I make a histogram for each of my PMTs. Then I set the voltages from the optimization, run my slightly brighter bead, the peak 5 bead, at all of these voltages, and set a gate. I run this every day and I use these voltages every day for my experiments. That way, if my instrument changes, if anything changes in my experiment, I can say to the FDA or to the reviewer of a paper that I have measured the same fluorescence every day. This is a great way to not only make sure that your instrument is characterized, but to make sure that you are characterizing your experiment as well.
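A minimal sketch of that daily check is shown below; the target medians, detector names, and tolerance are hypothetical values you would record on day one, not recommendations.

```python
# Run the brighter (e.g. peak-5) bead at your baseline voltages each day and
# confirm each detector's median still falls inside the gate set on day one.
import numpy as np

target_median = {"BL1": 52000, "YL4": 18000}   # medians recorded on day one (illustrative)
tolerance = 0.10                                # +/- 10% drift allowed (illustrative)

def daily_qc(todays_events):
    """todays_events: {detector: numpy array of bead intensities}."""
    for det, events in todays_events.items():
        med = np.median(events)
        lo, hi = (1 - tolerance) * target_median[det], (1 + tolerance) * target_median[det]
        status = "OK" if lo <= med <= hi else "OUT OF GATE - investigate before running samples"
        print(f"{det}: median {med:.0f} ({status})")
```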

00:21:26 – 00:21:44, Slide 27-29

Let’s review the voltage optimization. We want to optimize our PMT voltages so that we have a minimum voltage where we are always detecting our cells and our signal above background.

00:21:44 – 00:22:17, Slide 30

What if we do need to go below this minimum, and why would that ever happen? Say I put on my cells and I realize that my positive signal is off scale. If I haven’t titrated, or maybe I’m using GFP and I just can’t get that signal on the scale, the most important thing is to get it on scale, so you’re going to lower your voltage. If I have to do that, I know that I may be sacrificing signal to noise, but at least I have an understanding of that.

00:22:17 – 00:22:22, Slide 31

With GFP if I’m only looking at something bright, signal to noise may not be my most important consideration.

00:22:22 – 00:22:24, Slide 32

If I raise the voltage, there’s no advantage to resolution.

00:22:24 – 00:22:32, Slide 33

I don’t want to change these voltages just to chase resolution. Optimization is very important, so just get it done.

00:22:32 - 00:22:41, Slide 34

If I do this on an Attune NxT with the hard-dyed beads, it takes me about an hour. I throw the numbers into Excel and it’s very easy.

00:22:41 – 00:24:36, Slide 35

Now we have our flow cytometer ready, so what’s next? There are many ways to do panel design. There’s an excellent paper by Yolanda Mahnke and Mario Roederer. I believe it’s from 2010, but if you search panel design and those two names, M-A-H-N-K-E and R-O-E-D-E-R-E-R, you can find the paper, and it talks about ranking your markers. You want to choose the markers which best answer your research question. They rank them primary, secondary and tertiary. I find that very helpful. It’s not engraved in stone, but it’s very helpful because again, I use this little rhyme, “The more colors you use, the more sensitivity you lose.” It’s counterintuitive, but the more fluorochromes you add to a panel, the more you encounter a concept called background expansion because of spillover spreading. This is a problem of physics; it’s not something that compensation corrects for, not something we can avoid. If you can answer the question with fewer markers, it’s always an advantage to do so, but of course nobody wants to do that, right? We want to use more markers, and we’ll talk about how to do that. You’re going to want to titrate all your reagents, and you’re going to want to screen for the best ones. What are the best ones? Of course they’re the ones that separate the population best, but also, sometimes the same reagent conjugated to a different fluorochrome is going to give you a different staining pattern. You want to test all the combinations that you can, you want to troubleshoot issues with sensitivity, and I’ll give you some tips there, and then you generate your data.

00:24:36 – 00:26:08, Slide 36

Of course, this goes back to high school science, or maybe even elementary school science. We determine a biological hypothesis. Which antibodies do we need for markers of interest? You’re going to do a literature search, and there’s a really helpful resource for this; they’re called OMIPs. I’ll talk about that in a minute and show you where you can find them. The most important thing is that panel design is iterative. Predictive panels are exactly that, predictive. No matter how much expertise you have, I can give you a predictive panel, but I can’t tell you with any confidence whether it’s going to work. I used to teach week-long courses in flow cytometry, and in every single class there’d be people disappointed at the end that I couldn’t hand them a panel that worked. I certainly can hand you panels that I’ve done. I can give you panels from the literature that other people have done, and they have a reasonable chance of working, but unless you want to repeat their work, you’re not going to do the same thing on the same machine with the same cells. You’re certainly going to have to do this in an iterative manner. The idea is that you build a backbone panel that always works, and then you add antibodies and add markers and see if that backbone still works, until you can add as many markers as you need.

00:26:08 - 00:27:48, Slide 37

There is a lot of help out there. The flow cytometry community is really, really collaborative, and OMIPs were designed, or actually introduced, by the reviewers of the journal Cytometry. I think there are more than 50 now. These are open source, and they were designed because no matter where you publish a paper, you can never get all the information that you need into the journal’s methods section. Let’s say I want to publish a paper in Nature, “Carol’s T cells killed cancer.” Unfortunately, you can never get all of the important details out in the materials and methods section. So I can publish another paper, a two-page paper called an OMIP, an Optimized Multicolor Immunophenotyping Panel, in Cytometry. This gives all of the detail that flow cytometry nerds want to see, right? You can see important things like the gating strategy, the clones, the panel design. Did I have any difficulty with certain antibody combinations? Here you can see in the gating strategy, I don’t use a forward and side scatter gate to determine any of my cells of interest, I just use it as a debris elimination gate. I gate first on my doublets. I gate second on my dump channel and viability dye. I use different types of plots, maybe a plot that shows some back gating that I did. It’s really, really critical; you can use these as a reference.

00:27:48 – 00:28:40, Slide 38

We also keep track of these on our website. You can look at our flow cytometry resource library which has a tremendous number of resources for both people that are new to flow cytometry and people that are very experienced. We have flow cytometry application notes, techniques. We have videos so that if you are new to flow cytometry, you can figure out how a flow cytometer works. We also keep track of all the OMIPs here because if you Google them, sometimes only a few of them come up. I really appreciate them because when I studied T-cells, they were just activated or producing cytokines. Now, people ask me about panels for T-cell exhaustion. They weren’t exhausted when I was studying them. You guys have the hard work to do, right? The old people in the field, we did the easy work.

00:28:40 – 00:28:54, Slide 39

These will really help you with your research. I talked about the stain index. This is the formula for stain index: basically, the median of the positive minus the median of the negative, over two times the standard deviation of the negative.
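Written out, with the medians of the positive and negative populations and the standard deviation of the negative population (the form usually credited to Marty Bigos), that is:

```latex
\[
\text{stain index} \;=\; \frac{\mathrm{Med}_{\text{pos}} - \mathrm{Med}_{\text{neg}}}{2\,\mathrm{SD}_{\text{neg}}}
\]
```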

00:28:54 – 00:29:00, Slide 40

It’s a metric that allows us to look at the separation of our population. There are many of them out there, but this is one that we would commonly use.

00:29:00 – 00:30:05, Slide 41

Antibody titration: people always ask me what’s the best way to titrate an antibody. Any way you want to, right? There are a tremendous number of protocols out there for this. One that I like is by Ryan Duggan; Google Ryan Duggan, titration and Alexa Fluor 488. He has a great protocol that actually tells you the amount of PBS to put in a well when you’re titrating your antibodies. This is a titration that was done by one of our scientists. It looks at one-to-two serial dilutions of the antibody and then graphs the stain index. This is where there’s a lot of mythology about antibody concentration. Here’s the graph of the stain index, and when I was new to flow,

00:30:05 - 00:30:09, Slide 42

I would have said, “I would look at this and pick one of the higher concentrations of antibodies where that stain index flattens out, because I want to be at saturation, which might be the concentration here.”

00:30:09 - 00:30:53, Slide 43

But now I’ve learned that you can use a concentration here. There’s a difference between a saturating titer and a staining titer. What you’re trying to do here, or maybe what you’re trying to do, is make sure that in the population you’re trying to detect, you’re detecting the percent positive. Now, if you’re trying to see a population shift or an MFI shift, you’re going to want to use the saturating titer; that’s a different case. Most of us are just trying to say I have 20% of these or 50% of those.

00:30:53 – 00:30:58, Slide 44

In that case, you may want to use a separating concentration.

00:30:58 - 00:31:56, Slide 45

Beyond the saturating concentration, if you increase the antibody, you only increase the non-specific background. People always ask me, “Why can’t you just wash more?” Washing does not eliminate non-specific background, at least not very well. Washing is a great way to lose cells; you usually lose about half of them in every wash. It does eliminate excess fluorochrome, but that’s really not the important thing when you’re titrating and staining. A separating concentration gives you good separation of your cells, it reduces your spreading error, and it saves you antibody. You can often use the separating concentration when you’re only looking at the percent positive, which is really what the vast majority of us do when we’re doing immunophenotyping.
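As an illustration of choosing a separating titer from a titration like the one above, here is a minimal sketch; the stain index values and the "90% of plateau" cut-off are made up for the example, not a published rule.

```python
# Pick a separating titer from a 1:2 serial dilution (illustrative numbers).
stain_index = {          # concentration (ug/mL) : stain index from your titration
    1.0: 42.0, 0.5: 41.5, 0.25: 40.0, 0.125: 33.0, 0.0625: 21.0,
}

plateau = max(stain_index.values())          # best separation seen in the titration
separating_titer = min(
    conc for conc, si in stain_index.items() if si >= 0.9 * plateau
)
print(f"Separating titer ~ {separating_titer} ug/mL "
      f"(retains >= 90% of the plateau stain index of {plateau})")
```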

00:31:56 – 00:34:44, Slide 46-47

Let’s talk about spillover, compensation and spread. These terms are thrown around a lot in flow cytometry; they’re sort of the language that flow cytometrists speak. You’ll also hear the term spillover spreading. Bear with me a minute as I talk about something that’s not on the slide here. When I talk about compensation I often use FITC because it’s a fluorochrome that we’re all really familiar with. When I ask what color FITC is, you say green. I just set you up for failure because that’s a trick question. FITC is not green. We think FITC is green because we’ve been scammed. We think it’s green because we look at it under a microscope and we have a filter on that microscope that only allows those really energetic green photons back to our eye. I feel like making a Netflix-type documentary. The truth about FITC is that if we didn’t have that filter, we would see its full emission spectrum. Here you’ll see an emission spectrum on your screen, the emission spectrum of PE/Cy5.5 – I’m sorry, maybe it’s PerCP-Cy5.5, yes it is. An emission spectrum is a probability curve, so going back to FITC, if I hit FITC with a blue laser, the probability of FITC emitting green photons is very, very high. FITC also has a very good probability of emitting yellow photons or orange photons or even red photons, and I have a flow cytometer that detects yellow, orange and red photons very well. It’s very easy for me to determine where those photons are coming from if I have a single stain for every color that I use in my experiment. We then use compensation, which is an algorithm, to separate the emission of these fluorochromes and visualize them in multi-dimensional space. That sounds really tricky, and the math might be too complicated for me to do in my head, but it’s very, very easy in practice.

00:34:44 - 00:34:54, Slide 48

Here are two fluorochromes that we used, PerCP-Cy5.5 and APC Alexa 700, and here’s an example of where the photons emitted by one are detected in a neighboring PMT.

00:34:54 – 00:35:53, Slide 49

When I run a single-stain PerCP-Cy5.5 sample, all of the cells, or all of the beads, whichever this is, are detected in the primary detector where I want to make a measurement. They’re also detected in the secondary detector where I’m going to eventually make another measurement, so each cell gets a value here and it also gets a value in this detector. Compensation just uses an algorithm to force the median of this value on the Y axis to be equivalent to the median on this axis as well. We could have chosen another algorithm to do this, but we chose compensation. Compensation allows us to visualize the effects of spillover in multi-dimensional space.

00:35:53 – 00:36:01, Slide 50-51

Sounds complicated? It’s not. It just uses a vector and the slope of this line to do the calculation.
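For those who like to see the arithmetic, here is a minimal sketch of what the calculation amounts to for two detectors; the spillover values and event data are made up for illustration, and real software estimates the spillover matrix from your single-stain controls.

```python
# Minimal sketch of the arithmetic behind compensation for two detectors.
import numpy as np

S = np.array([
    [1.00, 0.12],   # fraction of dye A's signal seen in [detector A, detector B]
    [0.05, 1.00],   # fraction of dye B's signal seen in [detector A, detector B]
])

raw = np.array([        # raw values for a few events, one row per event
    [1200.0, 180.0],
    [  60.0, 900.0],
])

# observed = true @ S, so the true (compensated) values are observed @ inv(S)
compensated = raw @ np.linalg.inv(S)
print(compensated)
```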

00:36:01 – 00:36:40, Slide 52-56

Here we have a median of 1,277 and 60.2 – I’m sorry. We have an MFI here on this axis. We use compensation in our single stain sample to adjust these medians to be more equivalent and we have a fully compensated sample.
Now, when this cell gets a value here and this cell gets a value here, I can’t change that value. I can only bring it down to the lower end of the log scale and now I visualize this spread.

00:36:40 - 00:37:01

It’s not visible on this because – it’s not visible here, sorry, because it’s on the high end of log space. It’s much more visible here. This is what is called spread or spillover spread. I don’t really like those terms but we’re kind of stuck with them. That’s what you see here.

00:37:01 – 00:37:5, Slide 60

This is an important example looking at uncompensated and compensated data. Usually, we can recognize that the plot on the left is uncompensated by the strange diagonal, and on the right, we only know the data is compensated because it’s marked compensated. We really don’t have any way to statistically determine that those medians are equivalent in the compensation control. It’s important to know that we’ve used the right compensation controls; on the left we don’t know whether the population that is dimly stained for CD11b and brightly stained for CD11c is indeed double positive. We want to make sure we use the right compensation controls so that we have good data.

00:37:5 – 00:39:22, Slide 63

The process of compensation is very easy even though the math is kind of complex. Compensation beads are really the easiest way to do compensation. They work for all lasers and, depending on the manufacturer, they work for every antibody. The beads that we carry, the eBioscience UltraComp eBeads, work for mouse, rat or hamster antibodies, and the AbC beads also work for rabbit antibodies. The nice features of the beads are that the unstained bead fluorescence is very similar to most cells, and they give you a bright stain and a clear separation. The rules of compensation, of course, are that you have to have a single stain for each fluorophore in your sample or in your panel. You need to have a negative and a positive, either in the tube or in the algorithm, so you can use an unstained. It used to be important that you had the single stains as bright as or brighter than your sample, but when I talk to experts in the field like Dave Parks, who writes the compensation algorithms in the software, he says the algorithms should handle that math for you. I think it’s still probably important to have a very bright and well-separated population, and the beads do this for you.

00:39:22 - 00:39:40

It’s helpful certainly if the target is not highly expressed on your cells or expressed by only a few cells. For example, if you’re using a cytokine or FOXP3 or something that’s not well expressed, these compensation beads work really, really well.

00:39:40 – 00:39:57, Slide 76

Now that we’ve got our compensation controls done and our instrument voltages are set, we want to talk about some basic strategy for fluorochrome selection. We want to save the brightest fluorochromes for our dimmest markers.

00:39:57 - 00:40:59

How do we know if our markers are dim or bright, or if our fluorochromes are dim or bright? I’ll talk about that on the next slide. We want to save our important targets, which we’ve ranked using the paper that I discussed, for the brightest fluorochromes: APC, PE, the Alexa Fluor dyes, the Super Brights. These go to the targets that are the worst resolved: they have low expression or poor antigen access, or they have an unknown expression level. If I want to look at antigen X, I may not know what the expression level is. Then, I want to minimize the spillover. I think about this as part A and part B of panel design. I like to have a spillover spreading chart, that’s part B, and then I like to have a stain index chart, that’s part A. I want to look at stain index and spillover spreading. Let’s go to the stain index slide.

00:40:59 – 00:42:34, Slide 61

Here’s an example of the ranking of fluorochromes by stain index. This is one that we did at Thermo Fisher. We took freshly isolated PBMCs and stained them with anti-human CD4 conjugated to various fluorochromes. Now this, of course, will vary manufacturer to manufacturer and instrument to instrument. What we’ve listed here is the fluorochrome, the laser, the filter and the stain index that we got with the Attune NxT Flow Cytometer. You can certainly use this on your instrument; these values are relative, instrument to instrument, so it is best to do it on your own instrument, and different manufacturers will have this for their instruments and their fluorochromes. I like to have this printout in front of me so I can see the brightness of fluorophores. This also depends on several things. Fluorophores are bright because they’re quantum efficient. For example, FITC is very quantum efficient: when you hit FITC with a lot of light, you get a lot of photons. But FITC, unfortunately, can be a problem if you’re looking at an autofluorescent cell line, because autofluorescence, and autofluorescent proteins, also emit in the green. The brightness here only accounts for the stain index and not necessarily for the autofluorescence. That’s another thing we want to consider, but this is a great chart to have as you’re doing panel design.

00:42:34 – 00:43:40, Slide 77

Here we’ll start talking about spillover spreading. We talked a little bit about looking at fluorochromes that have emission spectra that overlap. On the left, you’ll see an example of CD3 APC Alexa Fluor 780 and CD56 APC. You can see that those emission spectra overlap a bit, and if you use these on cells that are co-expressing the markers, you can see that with the pattern you get, even if you use an FMO, it would be very difficult to quantitate those populations. If we move those co-expressed markers onto fluors like eFluor 450 and APC, we get a much better staining pattern that gives us a much better idea of how those cells co-express the markers.

00:43:40 – 00:43:50, Slide 78

This is a classic example of two fluorochromes that are used quite often in panel design.

00:43:50 – 00:44:41, Slide 79

Here’s an example of PE and PerCP-Cy5.5. PE is a very, very bright dye, and often when manufacturers come out with a new antibody, the first thing they do is conjugate it to PE, yet I’ll tell you that PE is possibly the worst place to make a sensitive measurement in your cytometer. Now, PE is very bright and we use it all the time, but PE is also the place that accepts a lot of error from the other fluorochromes that you use on your cytometer, and I’ll show you why. Here we have a PE single stain and we’re measuring it in PE and in PerCP-Cy5.5.

00:44:41 – 00:45:01, Slide 80

Again, as each cell goes through the cytometer, it gets a value in every detector. When we compensate that data, you can see the spread of this data. Compensation lets us visualize the spillover and the spread.

00:45:01 – 00:45:17, Slide 81

If we set a gate on the unstained population, it obviously looks like there’s a population of double-stained cells on the right. To know whether those really are double stained, we would have to use an FMO, but those events are definitely there.

00:45:17 - 00:45:52, Slide 82-83

There’s an easy way to do this in multicolor space; this is a neat trick. Instead of thinking about how bright our stain is, which is part one of panel design, the stain index, now we’re going to think about the antibodies we have to have. Say that for that Nobel prize, or that paper in Nature, we have to have a PerCP-Cy5.5 and a FITC antibody. Here’s the FITC antibody I absolutely have to have, and here I’m looking at that antibody in FITC and in PerCP-Cy5.5.

00:45:52 – 00:46:40, Slide 84

Now I add more antibodies to this panel. I’ve added the two antibodies that I cannot live without, and I’ve added APC-Cy7, BV421, PE/Cy7 and the other antibodies that I have to have in my panel. With all of these dyes in the panel, I can still use the two antibodies I need without any problems. You can see the background expands a bit from spreading of the other fluors, but no matter where I set a gate, I can easily determine the population percentage for those two dyes.

00:46:40 – 00:47:30, Slide 85

Now I add PE, BV605 and BV711, and now I have a problem. With the two antibodies I’m using, the FITC and the PerCP-Cy5.5, if the staining is as bright as it looks here, they’re not a problem, but if the staining is dim, I now have fluorescence all the way out to the third log in the PerCP-Cy5.5 channel. The FITC channel is virtually unaffected. FITC is one of those dyes that plays nicely with others; it does not accept error from any of these dyes. But in the PerCP-Cy5.5 channel there is background expansion, and right now all I know is that it’s from PE or BV605 or BV711. Certainly, the more experience you get with panel design, the better you’ll know where it’s coming from.

00:47:30 – 00:47:39, Slide 86

This little trick can tell you exactly where it’s coming from.

00:47:39 – 00:47:50, Slide 87

Here, I can look at every single stain sample and I can look at it in my FITC channel and like we saw before, all of these single stain samples look fine when we detect them in the FITC detector.

00:47:50 – 00:48:47, Slide 88

Now, when we look at all of the dyes, single stained and concatenated in FlowJo, we can see exactly where the problem is: PE and BV711. When I add these two dyes, they contribute quite a bit of background expansion to the PerCP-Cy5.5 channel. I know that if I’m going to use an antibody in PerCP-Cy5.5, it has to be brighter than that 10^3, because if it’s dimmer than that, I just won’t see it in multicolor space unless I use it on different cells than I’m staining with the PE and the BV711. Our new Super Bright dyes that also emit off the violet laser would have the same characteristics, because they have the same, or very similar, emission spectra.

00:48:47 – 00:49:33, Slide 89

Here, for example, are all the same dyes in the BV711 detector. You can see PerCP-Cy5.5 is going to cause some background issues for BV711, and here you can see all the dyes that cause background issues for PE/Cy7. Again, PE/Cy7 is very bright and often used in multicolor panels, but you can see it’s not a place where you would necessarily want to make a dim measurement. These are very important measurements to make, to understand which dyes contribute spread and which dyes you might want to either keep separate or save for your brightest markers.

00:49:33 – 00:50:53, Slide 94

Let’s just do a quick spreading error review. Spreading errors are basically caused by photon counting statistics. I like to think of photons like kids. Basically, we really like energetic kids. We like blue, green and violet photons; they have a tremendous amount of energy. When they hit that cathode, they knock off lots of electrons, and as scientists, we like lots of numbers, right? We can count those, we get great statistics, we get very low variation. Unfortunately, the red photons are like teenagers. I love my children, I raised them through their teenage years, but they’re a very bad investment. You put lots of energy in, you never get that energy back, right? There’s loss through rotation, translation and heat, and when they knock off electrons, sometimes they knock off a few, sometimes they knock off a lot, and when we count them, there’s just this spread, right? We can still use them. There are advantages to using them, but we need to understand that we get this spreading error, and there is a way to predict it and a way to use them wisely.

00:50:53 – 00:51:36, Slide 95

Here’s an example from a great paper from Mario Roederer’s group, and this is a calculation of some spillover spreading matrices. Now, this is another paper that is very complicated, and if you have to do the math by hand, I don’t envy you. We used to do this on all of our machines in the lab and it took a lot of time, but it’s very valuable data. Here you can see where the fluorochromes don’t get along and where you’re going to have issues. These are LSR II instruments, looking at Qdots and some of the other fluorochromes.

00:51:36 – 00:53:13, Slide 96

This is an example of a spillover spreading matrix that I generated in FlowJo with some of the scientists at Thermo Fisher. This is a very easy thing to do in FlowJo. You take the data from the single stains; these are CD4 stains on an Attune NxT V6 configuration, and these numbers give you some warning signs based on the red color. For example, this is VL5 and RL2 on the Attune NxT, Super Bright 702 and Alexa Fluor 700. These emit in the same channel off of different lasers – they have very similar emission spectra excited by two different lasers – so you can see where they might have an issue. This is just a little bit of a warning to say that you’re going to have to be careful if you use these two together on the same cell. This is really helpful, and you can do it easily with single-stained cells; this is with CD4 Super Bright dyes and several different fluorophores on an Attune NxT. It’s very easy to do for your own machines and doesn’t take long: just run the single stains, throw them into FlowJo, and here’s the button you click on for the spillover spreading matrix.
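If you want to compute one entry of such a matrix yourself, here is a minimal sketch based on one published formulation of spillover spreading (Nguyen et al., Cytometry A, 2013); the data layout is an assumption for illustration, and FlowJo automates all of this.

```python
# One entry of a spillover spreading matrix from a single-stain control:
# the extra standard deviation picked up in a secondary detector, scaled by
# the square root of the signal difference in the primary detector.
import numpy as np

def spillover_spread(pos, neg, primary, secondary):
    """pos, neg: {detector_name: numpy array of compensated intensities} for
    the positive and negative populations of one single-stain control."""
    sd_pos = np.std(pos[secondary])
    sd_neg = np.std(neg[secondary])
    delta = np.median(pos[primary]) - np.median(neg[primary])
    return np.sqrt(max(sd_pos**2 - sd_neg**2, 0.0)) / np.sqrt(delta)

# Repeating this for every single stain and every other detector fills in the
# matrix; large values flag fluorochrome/detector pairs that will not get
# along on the same cell.
```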

00:53:13 – 00:53:42, Slide 98

We’ll talk a little bit, very quickly, about staining controls. These are in the literature and there are a lot of great articles here. I won’t go too heavily into them, but there are some very important controls that you need when you’re doing multicolor flow cytometry, and the number one control that I find people are not using as much as they probably should is the fluorescence minus one control. These controls are not compensation controls, they’re gating controls, and then we’ll talk a little bit about the rest of the ones on the list here.

00:53:42 – 00:54:21, Slide 100

Fluorescence minus one controls contain all of the markers except the one of interest. If I want to do a FITC FMO control to determine what’s positive and negative for FITC, I use all the other fluorochromes in my panel except for FITC. They help us control for the effects of spread, they allow delineation of positive from negative, and they’re important for low-density or smeared populations.

00:54:21 – 00:55:40, Slide 107

Here are a couple of examples of how a fluorescence minus one control is used in gating for flow cytometry. You can see on the left, in both cases, we set a gate on the unstained. On the right, you can certainly see that that gate is not appropriate for the full stain and you want to move the gate up, but you certainly can’t convince a reviewer that Carol said you could move the gate up. Unfortunately, that doesn’t get a paper into a journal. I wish it did, but if you run the FMO, you can see that that gate definitely defines the population and accounts for the spillover spread that we see there. Setting gates is not arbitrary, and I certainly cannot set gates any better with 30 years of experience than you can; I use the proper controls as well. I don’t want to rely on the unstained. People certainly ask how many FMOs they need to run for every experiment. What I do is, if I’m doing a 14-color experiment, I’m going to do 14 FMOs with that first experiment.

00:55:40 - 00:56:56

Here for example, in the first panel of plots that you see, the PE/CY7 and the CD45 PerCP-CY5.5, I would not need to use those two FMOs after the first time I run them because no matter where I set those gates, I can get a good read on the separation between those two populations. Once I’ve set the FMO and I realize where I need to set it, I can tell a reviewer, “I ran these FMOs the first time, I got good separation, and I didn’t have to run them the next time.” In the bottom two panels, I probably would not need to use the CD8 FMO but I would certainly need to use the FMO for PE because there, the antigen is expressed on a continuum and I would really need to set that FMO every time because moving that gate very slightly in either direction is going to change the value significantly. Here again, also I probably wouldn’t even use quadrants for this gate. I would probably set them as squares because quadrants really don’t describe flow data very well.

00:56:56 – 00:58:09, Slide 108

Here again, you have controls based on what you’re going to do to your cells. If you treat cells with radiation, stimulate them, kill them, or do anything else to your cells that creates a change, there may be changes that affect your fluorescence, your scatter, or their stickiness. You want to do as many controls as you can, especially at the beginning of an experiment, so that you can see any of these changes that might occur. Isotype controls are very controversial in flow; they’re not often used anymore, and there’s a tremendous amount of literature out there on them. If you have to use an isotype control, you want to throw it into your FMO at the titration that you use for your antibody, and really the only things an isotype shows you are that you have a titration issue or a non-specific binding problem. It’s really something that you might want to consider using only in your FMO control.

00:58:09 – 00:58:45, Slide 109

Viability dyes are very important to eliminate unwanted cells. Dead cells are sticky, so it’s very important to use a viability control in every sample. A dump channel is not necessary but is often very helpful; you can eliminate unwanted cells from the analysis. This is very helpful in things like stem cell analysis. Stem cells are very rare, and we want to make sure we look at undifferentiated cells.

00:58:45 - 00:58:48, Slide 110

You can add differentiation markers all in the same channel, you can use them in the same channel as your viability dye and gate them out.

00:58:48 – 00:59:29, Slide 111

Here’s a slide that shows the amine-reactive dyes for dead cell identification. We have eight different options at Thermo Fisher. These are great. We used to use EMA, which required fixation with ultraviolet light, and when these came out, they were fantastic. Live cells have some free amines; dead cells have many more. So you can take these dyes, slot them into a place in your panel where you don’t have another reagent, and gate the dead cells out. We also have AbC amine-reactive compensation beads, so these dyes are really fantastic for use in your panel.

00:59:29 – 00:59:49, Slide 112

We also have a number of impermeant nucleic acid dyes for dead cell identification. These are very bright and come in a number of colors that you can also slot into your panel. These are used on live cells; the amine-reactive dyes can be used on live cells or on fixed cells.

00:59:49 – 01:00:49, Slide 113

Here’s an example of the importance of using a viability indicator. Forward and side scatter, we know, are not to be used for dead cell identification because they are not a good indicator of viability. In the upper left panel you see a forward and side scatter gate; this is using log side scatter, as the Perfetto, Herzenberg and Roederer labs often do. You can see here that if you gate separately on the live cells and the dead cells as determined by scatter, you’ll get a very, very different phenotype, and you’ll still have dead cells in the final analysis that look like CD107-positive cells. It’s very important that you gate them out at the beginning using a dead cell indicator.

01:00:49 – 01:01:15, Slide 115-120

Here’s a slide just looking at the differences that you’ll get when you exclude dead cells, when you use a dump channel and when you exclude doublets. If you’re looking at rare cells or if you’re looking at any cell, it’s really important to exclude the cells that you need to exclude in flow cytometry.

01:01:15 - 01:02:02, Slide 121

In summary: know your instrument, know the configuration, know the performance test determined by your manufacturer, know what the results tell you about how your instrument is performing, and do a voltage optimization. Know your fluorophores and their brightness, look at a stain index, and look at your compensation and your spillover spreading. Titrate; that is very important. Then, when you put the panel together, look at the spillover spreading and the antigen density, and always, always be a control freak. Use your controls for gating.

01:02:02 – 01:02:58, Slide 122

I’ll leave you with a slide with lots of educational resources. I think it’s very important to have mentors and to have web resources. Believe it or not, I did most of my flow before there was a web, but now we have tremendous resources on the web: we have guided learnings, you can use our School of Fluorescence and LabRoots of course, and we have archived webinars that you can use. Some are very basic, some are very advanced. We have the Invitrogen Flow Cytometry Panel Builder where you can look at what reagents to choose for your panels, learning centers, publications. Friends don’t let friends do flow cytometry alone. I appreciate your attention. Thank you very much, and if you have any questions, please let me know.

01:02:58 - 01:03:15

Moderator: Thank you Ms. Oxford for your informative presentation. We will now start the Q&A portion of the webinar and if you have a question you’d like to ask, please do so now. Just click on the “ask a question” box located on the far left of your screen. We’ll answer as many questions as we have time for.

01:03:15 - 01:03:21

Question 1: Our first question is, you mentioned there are a variety of methods for optimization. What are the other methods used and do you have a recommendation?

01:03:21 - 01:04:52

Yes. That’s a good question. We’ve been going back and forth about those methods for probably 20 years now. We just did a study at Thermo Fisher and published a poster about that and I can certainly e-mail the participants that would like that. There are probably also talks about this on the ISAC website, the International Society for the Advancement of Cytometry. My favorite method is the easy one but of course I want to get good data, too. I think the peak-2 method is the one I like, it uses a single bead looking at the standard deviation and the CV but there are a lot of methods out there. There are methods that are published by the group in Mario Roederer’s lab. If you want to take a deep dive certainly - Simon Monard did some talks on this and has published this as well, M-O-N-A-R-D. There’s a lot of data out there but I think my personal favorite is the peak-2 method. The antibody capture bead method is good as well, it just takes a little bit more number crunching.

01:04:52 - 01:06:28

Question 2: What is the difference between compensation spillover and spillover spreading?

That’s a very good question. I really don’t like these terms. Spillover sort of makes you think that there’s so much fluorescence it spills over from one channel into the next, and it’s really not that. It’s like I spoke about with FITC: you’re really detecting the photons of one fluorochrome in the neighboring detector. Spillover is exactly that; you’re detecting the photons of one fluorochrome in an adjacent PMT, or it doesn’t even have to be adjacent, you’re just detecting the photons of one fluorophore in a detector where you’re going to make another measurement. Spreading is caused by photon counting errors. That’s just caused by the nature of photons: if they have low energy, the teenagers, then they have a lot of spreading; it’s just physics. Compensation helps us visualize that spreading, but it doesn’t fix it. Until we get new PMTs that are better at detecting these photons, there’s just no way around it other than good panel design.

01:06:28 - 01:08:23

Question 3: We have time for one more question. I don’t have FlowJo. Is there any other way to generate a spreading matrix?

Sure, yes, you can actually use the papers. There are two papers, actually. The one that I gave in the lecture is certainly available; the math in that paper is a little more complicated than I like to do by hand. There is also a paper – I’m trying to remember it off the top of my head. It’s also a Mario Roederer and Marty Bigos paper. I think it’s called “Unraveling the immune system: 17-color flow cytometry,” and it has a bit of an easier equation that looks at the spreading, and you can use that equation to generate a matrix by hand. We certainly did that before FlowJo was able to do the math. Yes, you can do a matrix, or you can even take the matrix in either one of those papers and use it as a general guide. Certainly, the value of those numbers will change instrument to instrument, but the relationship between the fluorophores is not going to change. The pairs that are an issue on one machine are going to be an issue on every machine. They may be more or less of an issue, but they’re still going to be ones that you want to know will be a problem, so you can design your panel from the beginning knowing that they are going to be an issue.

01:08:23

Moderator: I’d like to once again thank Ms. Oxford for her presentation, and before we go, I would like to thank the audience for joining us today and for their interesting questions. Questions we did not have time for today will be addressed by the speaker via the contact information you provided at the time of registration. We would like to thank LabRoots and our sponsor Thermo Fisher Scientific for underwriting today’s educational webcast. This webcast can be viewed on demand through April 2019. LabRoots will alert you via e-mail when it’s available for replay. We encourage you to share that e-mail with your colleagues who may have missed today’s event. Until next time, goodbye.


How recent technological advances in flow cytometry instrumentation are enabling faster throughput

This webinar, presented by J. Paul Robinson of Purdue University, will focus on advances in the Invitrogen Attune NxT Flow Cytometer and outline where the technology within the Attune NxT fits in current and future applications.


Multiparameter Cell Cycle Analysis

Find out how to get better cell recovery and definition of cell cycle compartments. This webinar will focus on multiparametric cell cycle analysis of DNA and specific epitopes using a “washless” staining assay to minimize sample handling.

0:00:00 - 0:00:39, Slide 1

Moderator: Hello, everyone. Welcome to today’s live broadcast, Multiparameter Cell Cycle Analysis, presented by Dr. James Jacobberger, professor emeritus, and director of cytometry and microscopy core, Case Western Reserve University. I am Alexis Cross of LabRoots, and I’ll be your moderator for today’s event. Today’s webinar is part of the protein and cell analysis education series, brought to you by LabRoots and sponsored by Thermo Fisher Scientific. For more information on our sponsor, please visit thermofisher.com.

0:00:39 - 0:01:13

Now, let’s get started. Before we begin, I would like to remind everyone that this event is interactive. We encourage you to participate by submitting as many questions as you want at any time you want during the presentation. To do so, simply type them into the Ask a Question box, and click on the Send button. We’ll answer as many questions as we have time for at the end of the presentation. If you have trouble seeing or hearing the presentation, click on the Support tab found at the top right of the presentation window, or report your problem by clicking on the Ask a Question box located on the far left of your screen.

0:01:13 - 0:01:32

This presentation is educational and thus offers continuing education credits. Please click on the Continuing Education Credits tab located at the top right of the presentation window and follow the process to obtain your credit. So, without further ado, Dr. Jacobberger, you may now begin your presentation.

0:01:40 - 0:03:06, Slide 2

Thank you, Alexis. Hello, everyone.
The term cell cycle analysis has come to define cytometric assays that count cells in cell cycle phases, compartments, or states. I’ll use these words interchangeably throughout the talk. This slide shows a single-parameter histogram in the upper left for cells stained for DNA content, which gives us three phases, G1, S, and G2+M. A two-parameter plot in the lower left shows cells stained for DNA content plus a mitotic marker, which then gives us four phases, G1, S, G2, and M. M is synonymous with mitosis, the mitotic stages of the cell cycle. G1 are cells with one genome. S are cells that are synthesizing a second genome. G2 and M are cells with two genomes.

On the right, a three-parameter plot shows cells stained for DNA content, a mitotic marker, and two cyclin proteins. These are proteins that oscillate periodically and regulate the cell cycle. These data create a multi-compartment model of contiguous obligate states that cells pass through during the cell cycle. That is multiparametric cell cycle analysis.

0:03:06 - 0:04:55, Slide 3

In general, DNA is the primary base marker. It provides a three-compartment subdivision ordered in time, G1, then S, then G2+M. Periodically expressed genes or activities subdivide these phases further with the goal to create a model with objectively defined compartments that are unambiguous, that is, the cells pass through each compartment unidirectionally once.

This slide shows two-parameter histograms of data synthesized from the expression profiles of such periodic expression. The upper row contains expression that rises and plateaus within a single phase. The middle row shows expression that rises across phases. In both rows, cells divide at the plateau level, creating daughter cells that have half that level. In the third and fourth rows, expression rises and declines within a single phase, for example the first, second, and fourth columns, or across phase boundaries, the third column. These are synthesized patterns based on ideas, but we and others have observed most of these patterns when looking at specific markers.

In many cases, the compartments derived from these patterns are not unambiguous. For example, mitotic markers show elevated expression and mark mitotic cells, but the pattern contains cells that are entering and exiting mitosis within the same data space, i.e. that compartment is ambiguous. To render this ambiguous compartment unambiguous, we need additional markers.

0:04:55 - 0:06:18, Slide 4

I now introduce a fixation and staining protocol that is extensively published, so the specifics are unimportant, other than the lengthy time it takes to prepare cells prior to cytometry. Key elements are formaldehyde to fix the cells and prevent further enzymatic activity, followed by either alcohol denaturation and permeabilization, or detergent permeabilization without denaturation. Different markers are optimized by variations on this basic protocol. Since the markers are intracellular, optimized staining requires time, 30 to 90 minutes depending on the desired data quality, with 90 minutes providing the highest signal-to-noise ratio. Further, after antibody incubation, three washes at 15 minutes each, plus centrifuge time, are optimal.

In the next few slides, I’ll present for K562 cells, a human erythroid leukemia cell line, stained for DNA content, (phospho S10) Histone H3, which is a mitotic marker, and two mitotic cyclins, A2 and B1. We’ll walk through the resulting data analysis step by step to create a multi-compartment cell cycle analysis.

0:06:18 - 0:07:40, Slide 5

This slide illustrates the first analysis steps, which clean up the data. These include doublet discrimination, exclusion of any anomalies from perturbed flow, and a correction for signal drift. A region is set that includes singlet cells based on the shape of the primary DNA signal pulse. The first plot shows the signal peak on the Y-axis and the signal area on the X. Doublets, triplets, and so on have peak heights equal to single cells but areas equal to multiples of single cells. The middle row shows plots of DNA content versus time. For this run, there are no areas of perturbed flow, but if there were, regions would be set and used to exclude incorrectly measured data.

However, although here the effect is small, there is a continuous decrease in the signal over time. That can be corrected in the same way as compensation is applied. The quality, that is, the CV of the data, does not change over time, so the correction improves the CV of the corrected data compared to the uncorrected data. We examine all parameters for this effect and correct it when the effect is severe enough. This doesn’t always happen, and the causes behind it are complex and still under study.
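As a sketch of what these two clean-up steps can look like in code (the column names, percentile window, and linear drift model are assumptions for illustration; the actual gates in the talk were drawn on the plots):

import numpy as np
import pandas as pd

def clean_up(events: pd.DataFrame) -> pd.DataFrame:
    # 1. Doublet discrimination: singlets have a peak/area ratio near a
    #    constant, while doublets have roughly double the area for the same peak.
    ratio = events["DNA_peak"] / events["DNA_area"]
    lo, hi = np.percentile(ratio, [5, 99])      # crude singlet window
    singlets = events[(ratio > lo) & (ratio < hi)].copy()

    # 2. Signal-drift correction: fit the slow trend of DNA area versus time
    #    and divide it out, rescaling the parameter without changing its CV.
    slope, intercept = np.polyfit(singlets["time"], singlets["DNA_area"], 1)
    trend = slope * singlets["time"] + intercept
    singlets["DNA_area"] = singlets["DNA_area"] * trend.mean() / trend
    return singlets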

0:07:40 - 0:08:30, Slide 6

Since we are going to move continuously through a multiparameter data space via bivariate windows, I’ve included this slide to illustrate the overall process. What is plotted isn’t important, because in the following slides we will go through the process step by step. What is important is the idea that we have moved through, in this case, a five-parameter data space, following the arrows in a manner that does not exclude any combination that would equal a cell cycle state, or at least that is the idea. In this example, we start at the box labeled Begin and end at the box labeled End, which are the beginning and end of the cell cycle.

0:08:30 - 0:09:43, Slide 7

The next step is to separate interphase from mitosis. That is shown in the panel on the left. Phospho-Histone H3 and many other heavily phosphorylated epitopes increase dramatically when cells enter mitosis. For this epitope and others, dephosphorylation occurs in late mitosis and continues through immediate early G1. Thus, we can segment all M by gating all 4C cells with elevated phospho-Histone H3.

In these same plots of phospho-Histone H3 versus DNA, we can capture the early G1 cells, shown by the arrow and the word “Newborn.” We can check that measurement by examining cyclin B1 versus Histone H3. Because of the decreased background of the small early G1 cells, we can separate early G1 from the rare late M cells that have partially dephosphorylated Histone H3. That’s shown in the right panel.

0:09:43 - 0:10:42, Slide 8:

The next steps are to isolate G1 proper, that is, G1 minus early G1. Both G1 and G2 can be segmented using cyclin A2 versus DNA plots, with the M cells removed by Boolean logic. The remaining cells are in S. The critical G1-S and S-G2 boundaries can be checked by comparing the G1, S, and G2 phase fractions of the DNA distribution, determined by Gaussian modeling of DNA content with programs such as ModFit, to the frequencies of cells determined by the color-coded gating logic. Because there is a shape to the cyclin A2 versus DNA S-phase component, it can be broken into early S, S1, and the remainder of S, S2. The plot on the far right shows the DNA distribution color coded for G1, S1, S2, and G2.
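A minimal sketch of the Boolean bookkeeping described here (the column names and thresholds are placeholders; in practice the regions are drawn on the bivariate plots and ModFit supplies the reference DNA fractions):

import pandas as pd

def phase_fractions(df: pd.DataFrame) -> pd.Series:
    # Placeholder windows for 2C/4C DNA content and "elevated" markers.
    is_4c   = df["DNA"].between(3.6, 4.4)
    is_2c   = df["DNA"].between(1.8, 2.2)
    mitotic = is_4c & (df["pHH3"] > 1e3)             # M: 4C with high phospho-H3

    interphase = ~mitotic
    g1 = interphase & is_2c & (df["cyclinA2"] < 1e2)
    g2 = interphase & is_4c & (df["cyclinA2"] > 1e3)
    s  = interphase & ~g1 & ~g2                       # remainder, by Boolean logic

    # These frequencies can then be compared with the G1/S/G2 fractions from
    # Gaussian modeling of the DNA histogram (e.g. ModFit).
    return pd.Series({"G1": g1.mean(), "S": s.mean(),
                      "G2": g2.mean(), "M": mitotic.mean()})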

0:10:42 - 0:11:56, Slide 9:

Next, we segment G1 into two states, G1A and G1B. The importance of segmenting G1 into an uncommitted G1A and a committed G1B is that it separates what used to be called the restriction point, which is a commitment checkpoint. The mitotic cyclins are repressed in the uncommitted state via activity of the anaphase promoting complex/cyclosome, or APC/C. As it is inactivated, cyclin B1 expression increases. The level is still low and expression is continuous, therefore it is not a great marker, but at present some information can be obtained without adding another marker. The left figure shows how we can use the G1-S boundary. S1, color coded maroon and defined previously, and the early G1 cells, colored black here, are used to determine approximately where to place the region boundaries. The panel on the right shows the progression of the states G1A, G1B, S1, S2, and G2 in terms of cyclin B1 levels.

0:11:56 - 0:15:05, Slide 10:

We next turn to M phase. We define the earliest M state, P1, as 4C cells with maximum levels of cyclins and rising levels of phospho-Histone H3. That is shown in the upper left on the plot of pHH3 versus cyclin A2. The previously defined G2 cells in green define the critical boundary, and the upper boundary is placed at the cluster edge. Thereafter, gating only on M cells, we plot cyclin B1 versus cyclin A2, and the pattern shown in the upper right is segmented at cluster boundaries. In this plot, P1 and P2 are coincident, and P2 is defined by Boolean logic that excludes P1.

When the APC/C begins to activate, cyclin A2 is degraded, and this is captured as PM because cells in this state correlate with prometaphase. When cells have reached undetectable levels of cyclin A2, the mitotic checkpoint is entered with stable high levels of cyclin B1 and depleted cyclin A2. This state is labeled M because it normally correlates with metaphase. If cells are treated with a mitotic spindle inhibitor, for example, nocodazole, they arrest here and this state will become highly populated.

Next, cyclin B1 is degraded and this correlates with the onset of anaphase, but the rate of decay does not correlate well with the sequence anaphase I, anaphase II, and telophase. Since these states do not correlate well with morphologically defined stages, we label them LM for late mitosis. LM1 is defined by cells with less than maximum but more than minimum cyclin B1, that is, cells that are degrading cyclin B1. LM2 is defined by cells when cyclin B1 has been degraded to a minimum.

Within this state, there are additional states that can be defined by morphology. On a plot of DNA pulse peak versus the DNA integrated signal for phospho-Histone H3-positive cells, we observed two clusters in the transition. The reason for this is that if the cytometer is well-tuned, the late anaphase and telophase cells look like doublets. This is shown in the lower right panel. The cells with the lowest peak signal, that is, those enriched in telophase can be further subdivided into a group with maximum phospho-Histone H3, that is, LM2C, and those with lower levels, LM2D.

The figure on the lower left shows that cells divide from LM2C and LM2D, or at least that is the working hypothesis.

0:15:05 - 0:15:43, Slide 11:

Thus, in a 3D plot, we can visualize most of the 15 unambiguous states, 14 of which are sequentially traversed by unperturbed proliferating cells. An average cell of this population, under the proliferative conditions defined by the environment at the time of fixation, moves sequentially from newborn G1 to LM2C, then either back to newborn G1 or on to LM2D and then to newborn G1. This provides a continuous backbone onto which other markers can be mapped.

0:15:43 - 0:16:45, Slide 12:

Now, I’d like to introduce the idea of adding information obtained from a different platform. The data are from a laser scanning cytometer. In this case, this is a different, attached cell line, but in principle, the same samples of fixed and stained cells that were analyzed by flow cytometry previously could be analyzed in this manner.

The plot at the upper left shows cells stained for phospho-Histone H3 and DNA, and I have set a gate for mitotic cells. The plot on the lower left shows nuclear size on the Y-axis and, on the X-axis, the density of cyclin B1 in each cell. Region R10 contains the largest nuclei with the minimum cyclin B1 density. Region R15 contains the smallest 4C nuclei with maximum cyclin B1 density.

0:16:45 - 0:17:15, Slide 13

If we look at the R10 images, we observe that cyclin B1 is cytoplasmic and on the centrosomes, but not in the nucleus. Many of the centrosomes are well separated. At the beginning of mitosis, cyclin B1 first accumulates on the centrosome, which then begins to separate; the two centrosomes end up opposite each other and create the poles of the mitotic spindle.

0:17:15 - 0:17:34, Slide 14

If we look at the images in R13, to the right of R10, we can see that cyclin B1 has entered the nucleus in approximately 50% of the cells. In those cells in which cyclin B1 is still in the cytoplasm, there are two centrosomes, and most are well separated.

0:17:34 - 0:17:54, Slide 15

If we go to R15, the cells are almost all in metaphase. Cyclin B1 is now cytoplasmic again because the nuclear membrane has broken down. The cyclin B1 density is still high, though, because the cells have rounded up and cyclin B1 decorates the mitotic spindle.

0:17:54 - 0:19:59, Slide 16

Now, I’d like to take a step back and go over the underlying principle behind the complex bivariate data patterns we’ve been analyzing. This slide shows the expression of two parameters over time. The top row is parameter one and the middle row is parameter two. The data are simulated events that vary on the Y-axis; that is, for every point in time, there are multiple cells representing the expression level as if all of the cells in the population were synchronous. The two parameters are correlated in time. When a proliferating population of cells is sampled, because the cells are asynchronous, all points along the timeline of expression are represented. Thus, if we plot parameter one versus parameter two, we get two-parameter histograms, as shown in the bottom row, that look like typical flow cytometry data.

I’ve color coded the clusters and transitions in the bottom row and their corresponding segments in the two top rows. For example, the two stable minima at the beginning and end of the time sequence are ochre and brown. They are represented in the bottom bivariate plots as the first cluster in the lower left. The sharply rising expression of parameter one, cyan, is correlated with expression of parameter two at a stable minimum, also cyan. Therefore, the transitional cyan cells move to the right in the parameter one direction and remain fixed in the parameter two direction at the bottom of the histogram. If the expression of one parameter is shifted in time relative to the other parameter, the number of cells populating the clusters and transitions is affected, as in the middle and right columns.
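To make the geometry of these cluster-and-transition patterns concrete, here is a small, purely illustrative simulation (none of it is from the talk; the sigmoid profile shapes, time shifts, and noise level are invented). Asynchronous cells are sampled uniformly along the cycle, two time-shifted expression profiles are evaluated, and the resulting plot reproduces two clusters joined by a transition.

import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(1)
t = rng.uniform(0, 1, 20000)          # asynchronous cells: uniform in cycle time

def profile(t, rise_at, width=0.05):
    # Expression that sits at a minimum, rises sharply at `rise_at`,
    # then plateaus (division back to half the level is ignored here).
    return 1.0 / (1.0 + np.exp(-(t - rise_at) / width))

p1 = profile(t, 0.40) * np.exp(rng.normal(0, 0.15, t.size))   # lognormal-ish noise
p2 = profile(t, 0.55) * np.exp(rng.normal(0, 0.15, t.size))

plt.hist2d(p1, p2, bins=100)
plt.xlabel("parameter 1"); plt.ylabel("parameter 2")
plt.show()
# Because parameter 2 rises later than parameter 1, the transitional cells
# move right along parameter 1 while parameter 2 stays at its minimum,
# producing an L-shaped path between the low-low and high-high clusters.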

0:19:59 - 0:21:34, Slide 17

Now, going back to our data from the laser scanning cytometer. If we calculate the median fluorescence and plot it on the Y-axis, and plot relative cell cycle time calculated from the compartment frequencies on the X-axis, we can see the profiles underlying the parameters. Here, phospho-Histone H3 expression and cyclin B1 density or any other marker that we have included. If we have imaging data, then we can do the same thing for other information by showing the time-related cumulative change in frequencies.

For example, this plot shows the parameter expression profiles and reveals the variation in entry into, or passage through, the mitotic stages. It can be seen that the accumulation of two centrosomes parallels prophase, and both correlate with the rise in phospho-Histone H3. Equally, cyclin B1 enters the nucleus rapidly in all of the cells over a very short period, and this is followed immediately by entry into prometaphase. The expression of cyclin B1 through this period is maximal and stable, but the density rises sharply at the end of prophase. Metaphase is more variable, but by the time most of the cells have entered it, cyclin B1 density remains high. By the time anaphase begins, cyclin B1 density is falling. Thus, we can quantitatively correlate expression, movement of molecules, and morphological changes.
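For readers who want to reproduce the x-axis, here is a minimal sketch of one way to turn compartment frequencies into relative cell-cycle time. It is an assumption-laden simplification: it treats the fraction of cells in a compartment as proportional to the time spent there, ignoring the age-distribution correction for exponentially growing populations, and the state names and frequencies are placeholders.

import numpy as np

compartments = ["P1", "P2", "PM", "M", "LM1", "LM2"]       # example mitotic states
freq = np.array([0.08, 0.10, 0.12, 0.40, 0.20, 0.10])      # made-up fractions of mitotic cells

rel_time = np.cumsum(freq) - freq / 2     # midpoint of each compartment on a 0-1 time axis
for name, x in zip(compartments, rel_time):
    print(f"{name}: relative time {x:.2f}")
# Plotting each compartment's median fluorescence against these x-values
# recovers the expression profile over (relative) cell-cycle time.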

0:21:34 - 0:24:38, Slide 18

This is about where we and others have taken multiparametric cell cycle analysis. I’d like to make a few comments, and then I’ll present some of our recent work. Since I can imagine critics saying, “This is all nice, but what real value is this complicated analysis?” I’ll start on the right side of this slide first. The first practical application is in pharmacodynamics. Cell cycle regulators and other targets that affect the cell cycle are still rational approaches in cancer chemotherapy. This type of assay might provide improved information in pharmacodynamic studies and subsequently serve as a possible therapeutic guide. We have built an analysis of DNMT1, a target of azacitidine or decitabine therapies, using a reduced version of this approach.

Second, for any blood or bone marrow analysis, some subset of this approach could reduce the complexity of the data by normalizing expression of other molecules across the cell cycle, possibly reducing the width of the data by a factor of two, and coincidentally providing high-quality doublet discrimination. Finally, my good friend David Hedley believes that it is time to use this information to study the effects of drugs in therapeutic studies using xenografts. I believe he’s right.

To the left side, these are the things that I would do going forward. G1 can be compartmentalized just as we have done for mitosis. There are many candidate markers; the data will be a little messier because G1 is more highly variable, but I expect it to break apart in a similar fashion. G2 has a major DNA damage checkpoint operating. This has been extensively studied, and the probes are available to see if G2 could be as interesting. I would like to do studies that incorporate data from multiple platforms into a single correlated dataset using the principles I’ve outlined. Imaging is my favorite, but slit scan is underexplored, and the work of [George DuCane] and [Sergei Gulnik] demonstrates that differential permeabilization may be a fruitful way to get at molecular sequestration.

I think the day has come for a new generation of probes, and camelid antibodies are my favorite in that regard. Multiparametric data should be analyzed by probability state modeling; see Bruce Bagwell’s work on this. This would eliminate, or reduce significantly, the number of decisions about where to set regions. Finally, I think improved hardware is not out of the question.

0:24:38 - 0:26:09, Slide 19

Speaking of improved hardware, I’d like to present some results that I’m very pleased with.

Here’s a reference to a paper by Goddard et al. on acoustic focusing cytometry. This is a technology where acoustic energy is directed into the flow cell in such a way that the center of the stream is a volume where the waves cancel and there is an absence of energy. Thus, the cells move to the center, and once in the center, they stay there. This is illustrated in the next few images by an illustration on the left and a photograph on the right.

Thermo Fisher has commercialized this technology in the Attune NxT Flow Cytometer. What this meant to us is that we could try to develop a semi-washless protocol and reduce the time that we use to stain cells. The idea is to wash away the fixative normally, then incubate with antibodies in a small volume, then add one large dilution. The Attune instrument can run a large volume very rapidly, up to 1 mL per minute. The instrument can do this because the cells all move to the center and remain in focus. Thus, the unwashed, large-volume sample might be able to go directly onto the instrument after an incubation period in a dilution.

0:26:09 - 0:26:50, Slide 20

Here is the protocol. Without going through the entire thing, it is just like our standard protocol except that we save time on the washes and on acquisition, a time savings of two hours: 5½ hours compared to 3½ hours. Additionally, in these assays, each centrifugation results in a loss of cells, to the extent that starting with a million cells usually provides significantly fewer than 100,000 analyzable events. We have not quantified it, but this washless assay does provide noticeably better cell recovery. We’ll now look at some comparative data.

0:26:50 - 0:27:15, Slide 21

Here are the data comparing the two assays. They are visually equivalent. The top row compares phospho-Histone H3 versus cyclin A2. The middle row compares phospho-Histone H3 versus cyclin B1. The bottom row compares cyclin B1 versus cyclin A2 for mitotic cells.

0:27:15 - 0:27:25, Slide 22

We quantified five mitotic states as these are the rarest events. The regions are shown in the bottom row.

0:27:25 - 0:27:50, Slide 23

Here is the result. There’s not a significant difference between the percentages of cells within each compartment or the levels of cyclin A2 and B1 in the cells. Therefore, this is a significant step forward for this kind of work. Since I have made the comments that I would ordinarily say for the summary, I’ll stop here.

0:27:50 - 0:28:15, Slide 24

One more thing, the work shown in this talk depended on the work of many people over many years from my own laboratory and those of many others. The references listed on this slide are reviews of the work or the extensive references to our work and that of others. Thank you for listening.

0:28:15 - 0:28:50, Slide 25

The people who did the work at Case Western Reserve University are Tammy Stefan and Phil Woost on the experimental side, Mike Sramkoski and Allison Kipling on the instrument side. From Thermo Fisher, Suzanne Schloemann ran our initial experiments, and Brian Wortham provided access to the instrument. Jolene Bradford and Mike Ward were critical for moving this project forward, and we continue to work with them on other projects.

0:28:50 - 0:29:17, Slide 26

Moderator: Thank you, Dr. Jacobberger, for your informative presentation. We will now start the live Q&A portion of the webinar. If you have a question you would like to ask, please do so now. Just type them into the Ask a Question box, and click on the Send button. We’ll answer as many questions as we have time for at the end of the presentation. Questions we do not have time for today will be answered by Dr. Jacobberger via email following the presentation.

0:29:17 - 0:30:35, Question 1

Our first question is, presumably information increases with the addition of parameters, is the relationship simple? If so, what is the relationship between information and parameter number?

I’ve looked at this in some detail, and the short answer is that there’s no simple relationship. I have the feeling that early on we hit on parameters that were more informative per parameter than I expect to get from added parameters in the future. For sure, an additional parameter should add at least one piece of information or it’s redundant. It’s not unthinkable that two or three pieces of information might be added per parameter. That’s about all I know about that.

0:30:35 - 0:31:15, Question 2

Our next question is, you have presented that analytical scheme of the backbone with the idea that this is definitely extensible. Have you tested this idea?

The short answer is no, I haven’t. There’s no reason that I can possibly think of that would prevent that from being true. So theoretically, it’s doable, but a real test of it still needs to be done.

0:31:15 - 0:33:55, Question 3

It looks like we have time for one more question. Can you give some examples of when a no-wash or minimal wash protocol will be effective?

Yes, I think that, as far as I’m concerned, it’s effective almost always because it saves time, but perhaps the more important case would be for things like clinical samples that are very limited in cell numbers. The idea of not losing cells through centrifugation comes to the forefront, and the idea of incubating with antibodies and then just diluting up and running, and being able to run almost the entire sample, is very appealing, especially for clinical samples.

We did a study here recently, which was published recently, on sickle-cell disease and treatment of patients with low-dose decitabine to reactivate fetal hemoglobin. What we were looking for was the decrease in DNMT1 as a function of the decitabine dose. All of the samples had been collected prior to us developing the assay, and we didn’t have any input into how the samples were prepared. Those samples were - I don’t remember the exact cell numbers, but it was very few cells, something like a couple of hundred thousand in a relatively large volume of fixative. So these were precious samples where we really didn’t want to lose any cells. In that case, the no-wash approach is almost a lifesaver. It’s not that it can’t be done with washing, but I’m certainly much happier with the no-wash solution.

0:33:55 - 0:34:27, Question 4

Thank you again, Dr. Jacobberger. Do you have any final comments for our audience?

No, I hope you enjoyed the seminar – or the webinar. My email is, I think, in the credits or whatever, and I’m happy to answer any questions, if somebody has them, after the fact through email.

0:34:27 - 0:34:40

Moderator: I would like to once again thank Dr. Jacobberger for his presentation. I would also like to thank LabRoots and Thermo Fisher Scientific for making today’s education webcast possible. Questions we did not have time for today will be answered by Dr. Jacobberger via email following this presentation.

0:34:40 - 0:35:03

Before we go, I’d like to remind everyone that today’s webcast will be available for on-demand viewing through May 30th of 2018. You will receive an email from LabRoots alerting you when this webcast is available for replay. We encourage you to share that email with your colleagues who may have missed today’s live event. That’s all for now. Thank you for joining us, and we hope to see you again soon. Goodbye!


Making polychromatic flow cytometry easy after instrument characterization and verification

Join Grace Chojnowski, Flow Cytometry and Imaging Facility Manager at QIMR Berghofer Medical Research Institute as she discusses how instrument standardization can help simplify your flow cytometry.

0:00:00 - 0:00:14, Slide 1

Hello. We’re glad you’ve joined us in this live webinar, “Making Polychromatic Flow Cytometry Easy After Instrument Characterization and Validation”. I am Judy O’Rourke of LabRoots and I’ll be moderating this session.

0:00:14 - 0:00:57

Today’s educational web seminar is presented by LabRoots, the leading scientific social networking website and provider of virtual events and webinars advancing scientific collaboration and learning. It’s brought to you by Thermo Fisher Scientific. Thermo Fisher is a world leader in serving science whose mission is to enable customers to make the world healthier, cleaner, and safer. Thermo Fisher Scientific helps their customers accelerate life sciences research, solve complex analytical challenges, improve patient diagnostics and increase laboratory productivity. For more information, please visit www.thermoscientific.com.

0:00:57 - 0:01:51

Let’s get started. You can post questions to the speaker during the presentation while they’re fresh in your mind. To do so, simply type them into the dropdown box located on the far left of your screen labeled “Ask a question” and click on the Send button. Questions will be answered after the presentation. To enlarge the slide window, click on the arrows at the top right-hand corner of the presentation window. If you experience technical problems seeing or hearing the presentation, just click on the support tab found at the top right of the presentation window, or report your problem by typing it into the “Ask a question” box located on the far left of your screen. This is an educational webinar and thus offers free continuing education credits. Please click on the “Continuing education credits” tab located at the top right of the presentation window and follow the process to obtain your credits.

0:01:57 - 0:02:51

I now present today’s speaker, Grace Chojnowski, the Flow Cytometry and Imaging Facility Manager at the QIMR Berghofer Medical Research Institute in Brisbane, Australia. Grace has a strong instrumentation background and has been managing flow cytometry core facilities for more than 30 years. She came to QIMR Berghofer in 1993, having earlier worked as the flow cytometry facility manager at the Peter MacCallum Cancer Institute in Melbourne. She graduated from RMIT University with a Master’s in Applied Science. Grace is active with the Australasian Cytometry Society, organizing courses and meetings, and serves on its executive committee. She has served two terms as a councilor for the International Society for Advancement of Cytometry and continues to be active with the society. Grace’s complete bio is found on the LabRoots website. Grace Chojnowski will now begin her presentation.

0:02:51 - 0:03:15

Thank you very much, Judy, and thank you for your kind words and the opportunity to share the work we’ve done here at QIMR Berghofer on instrument characterization and validation.

0:03:15 - 0:04:13, Slide 2

So, my role at QIMR Berghofer is to try to facilitate good cytometry for all our internal academic users and our external academic users, and also to help some of our commercial clients with contract work, as well as some of the clinical trial work we do. We have users who have a reasonably deep understanding of flow cytometry, but we also have those who become a little bit overwhelmed by the complexity of some of the instrumentation. We have users who prepare a panel on an analyzer, then want to do some sorting using the same panel, and cannot understand why their populations may look slightly different and why one instrument gives them better separation than others.

So, we embarked on this set of experiments to see if we could answer some of those questions.

0:04:13 - 0:04:41, Slide 4

So, what did we do? Beads are the standard used for flow cytometry performance checking. We know that these beads use different dyes from the ones that are commonly used when performing flow cytometry applications and assays. As such, we wanted to test our instruments’ response to the same fluorochromes that we use for our day-to-day assays, and not just the dyes that are attached to the beads.

0:04:41 - 0:06:00

Another point is that the brightness or staining indexes provided by vendors give us an indication of how bright some of these fluorochromes may be, but that is not always the same on our instruments, on my instrument or your instrument. We tested 61 CD4 antibodies covering 46 fluorochromes and a number of different clones. It would’ve been ideal to have had the same clone for all the different fluorochromes, but we weren’t able to obtain all 46 in the same clone. Initially, we started using normal healthy volunteer peripheral blood, but as you can imagine, we needed a lot of blood from one volunteer so we could avoid any variability from multiple donors in one experiment. After a while, we decided to start using comp beads as well. The reason is that comp beads don’t change from day to day or individual to individual. The concentration is always constant, and it’s a lot easier to use comp beads than large volumes of blood from different volunteers.

0:06:00 - 0:06:58, Slide 5

So, polychromatic flow cytometry has grown rapidly over the last five to 10 years. Many years ago we had the organic dyes like FITC and some of the phycobiliproteins like PE and APC, then along came the tandem dyes, the Qdots, and more recently the new Sirigen dyes. This has increased the availability of new dyes, along with new instrumentation that’s able to acquire many more parameters simultaneously, allowing us to measure so many more antigens simultaneously on one single cell. Having a good handle on instrument characteristics and the optical configuration can have a big impact on the ability to resolve many different cell populations.

0:06:58 - 0:07:36, Slide 6

Many vendors that sell reagents with the different fluorophores will give us an indication of their level of brightness, and some vendors will rank these dyes, as can be seen in this table. But this ranking may be different on your instrument or my instrument, either because my optical configuration is different, or because I use different collection optics, or because my laser excitation power or wavelength is different.

0:07:36 - 0:07:50, Slide 7

Here we have some fluorochromes that are ranked in order of brightness but also grouped as high, medium, or low brightness.

0:07:50 - 0:08:05, Slide 8

Here, again, we’ve got the ranking from very high to very low, or numerically from five (very high) down to one (very low).

0:08:05 - 0:08:57, Slide 9

Here’s the list of the fluorochromes that we used in our instrument characterization, as well as the CD4 clone that was associated with each fluorochrome. There were three main vendors that we obtained our antibodies from, and I’ll name them A, B, and C. The CD4 clone SK3 was the most common one used, followed by RPA-T4, and then a couple of the S3.5 clones. You can also see that for some of the antibodies, the same clone with the same fluorochrome was available from a number of different vendors. For example, APC we could get from a number of the vendors.

0:08:57 - 0:10:06, Slide 10

This table shows the optical configuration of the instruments at our institute which we used for these experiments and on which we obtained our individual instrument characteristics. You can see that some of the instruments have identical optical configurations: Fortessa 1 and 2 are identical, and Fortessa 4 and 5 are also identical, so they have the same lasers, the same dichroics, and the same collection filters, whereas the other ones may have slight variations in some of the bandpass filters and dichroics. I haven’t listed the dichroic filters or the steering optics here, but they may also differ slightly from instrument to instrument.

0:10:06 - 0:11:13, Slide 11

Okay. So, antibody titer. All antibodies were titrated on normal healthy volunteer peripheral blood, but we also compared titers using the comp beads. We expressed the titer as protein in micrograms per mL. We wanted to make sure that the same amount of antibody was present for all the antibodies that we tested, so that the only difference was the fluorochrome we were using. A titer on peripheral blood is a lot easier to obtain, but with comp beads, because of all the binding sites on the beads, we really wanted to make sure that the amount of protein was the same across all the fluorochromes.

0:11:13 - 0:11:42, Slide 12

The staining index. What is the staining index, and what does it tell us? It lets us know how well resolved a population is from the background, or from the negative population. The brighter the signal above the background or the unstained sample, the higher the staining index.

0:11:42 - 0:13:01, Slide 13

For all the data acquired for each antibody at each titer, we calculated the staining index. You calculate the median fluorescence intensity, or MFI, of the positive cell population and the MFI of the negative cell population, then calculate the standard deviation of the negative population, and then you use a formula where you subtract the negative MFI from the positive MFI and divide it by two times the standard deviation of the negative population. As the negative population spreads, its standard deviation increases, which then has an effect on the staining index. The staining index allows us to determine how well resolved our CD4 population is from the background, or negative, population.
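Written out as a formula, this is just the calculation described above, with MFI for the median fluorescence intensity and SD for the standard deviation of the negative population:

\[
\mathrm{SI} \;=\; \frac{\mathrm{MFI}_{\text{pos}} - \mathrm{MFI}_{\text{neg}}}{2 \times \mathrm{SD}_{\text{neg}}}
\]

A broader negative population therefore lowers the staining index even when the positive MFI does not change.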

0:13:01 - 0:14:17, Slide 14

Here is an example of some of the staining index results we got, and what we considered to be the optimal titer. We did this for all the fluorochromes. I’ve also plotted the positive median fluorescence and the negative median fluorescence, and then in red we can see the staining index. You can also see that having the highest median fluorescence intensity on the positive population doesn’t necessarily mean that that higher titer or concentration is going to give you the best staining index; sometimes you will also get an increase in the negative, which will have an effect on the final staining index number. It’s about getting the best separation between your negative cells and your positive cells.

0:15:13 - 0:15:46, Slide 16

We also did some voltage titration experiments, where we altered the voltage for each detector to see what the best voltage was for the different fluorochromes, and looked at the range where you get the best separation between your negative and your positive signals.
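As a rough sketch of what such a voltage titration boils down to in software (the voltage range is arbitrary and acquire_at is a hypothetical helper standing in for acquiring stained and unstained data at a given PMT voltage; this is not the facility's actual workflow):

import numpy as np

def staining_index(pos, neg):
    # Same staining-index definition used throughout the talk.
    return (np.median(pos) - np.median(neg)) / (2 * np.std(neg))

def best_voltage(acquire_at, voltages=range(400, 801, 50)):
    # acquire_at(v) is assumed to return (positive_events, negative_events)
    # recorded with the detector at PMT voltage v.
    scores = {v: staining_index(*acquire_at(v)) for v in voltages}
    return max(scores, key=scores.get), scores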

0:15:46 - 0:17:00, Slide 17

The results. Here, we’ve got the response from one of our instruments, one of our Fortessas. It has five lasers, and we were able to configure the instrument to acquire data from all the fluorochromes we tested. We can see here that the two fluorochromes that give us the best staining index are from two different vendors. They’re excited by the same laser, collected by the same detector, and they also have similar spectral properties. What this is indicating is that the detector picking up those fluorochromes is very sensitive to that part of the spectrum, and also that those two fluorochromes are reasonably bright.

0:17:00 - 0:18:20, Slide 18

Here we’ve got four instruments. The two top instruments are identical in their optical configuration, and the bottom two are also identical in theirs. The two on the left, the SH-04 and the F4A, are older instruments, and the two on the right have been purchased more recently. The two on the left were very popular and usually quite booked out; our users like to run their experiments on the same configuration, and they said, “Well, we need instruments that are exactly the same,” so we purchased additional instruments with the same configuration to keep our users happy. We can see that the older instruments don’t have as good a response as the new ones. Later in this presentation, I will try to address why this is, or why I think this is.

0:18:20 - 0:18:38, Slide 19

Here we have one of our cell sorters, and you can see that the response to most of the fluorochromes is good, but the response to the fluorochromes excited by the 561 nm laser is particularly good.

0:18:38 - 0:19:10

Just before we performed this instrument characterization, we had a few issues with this instrument and the 561 laser died. So, we got a brand new laser, all the fibers and everything were replaced, and everything was realigned, which could be another reason why the yellow-green is giving us much superior signals to the rest of the fluorochromes.

0:19:10 - 0:20:00, Slide 20

This is the stream-in-air cell sorter, our MoFlo, and again, we’ve got an excellent response to the yellow-green fluorochromes. One thing that we did do for this instrument, when the new fluorescent protein dyes became popular, was purchase a reasonably high-powered laser to excite some of those fluorescent proteins, such as mCherry. So, the power from this laser is a lot higher than the power of the other lasers on the instrument, which can also explain why we’re getting such a good response.

0:20:00 - 0:20:55, Slide 21

So, do the same fluorochromes from the different vendors have a better response on our instruments? No, not really. They’re all pretty much the same. We compared the same CD4 clone with the same fluorochrome from different vendors and found that they pretty much gave similar results on all our different instruments.

We’ve got APC, FITC, PE-Cy7, and PE across all the instruments and their responses. Wherever you got a better response for one vendor’s fluorochrome, you also got a reasonably good response from the other vendors.

0:20:55 - 0:21:59, Slide 22

Comparisons between different fluorochromes using the same detector: when you compare different fluorochromes from different vendors, there are differences. These could be due to the actual brightness of the dye, but they could also be due to the specific collection optics of that instrument. Some filters may be better suited to a particular fluorochrome than another, and we can’t constantly be changing our collection filters to accommodate one fluorochrome after another from one experiment to the next. You can see that some of the emission spectra are reasonably close, but they’re not all identical.

0:21:59 - 0:24:21, Slide 23

In an ideal flow cytometry world, our dyes would emit in a narrow spectral band with no or very little spillover signal into other detectors. The reality is that fluorochromes emit into a whole lot of other detectors where you don’t want them, not only adjacent detectors but also detectors on different excitation wavelengths. Sometimes the spillover can be quite significant, and this makes multicolor or polychromatic cytometry and panel design difficult. When there is spillover of this sort into multiple detectors, it has the effect of spreading the nominally negative cell distribution, and this can have a huge impact on the ability to resolve some populations, especially those with very low antigen expression. So, after we obtained all these instrument characteristics, we used the feature in FlowJo that is able to calculate the spillover spreading matrix between the different fluorochrome combinations. A table like this makes it easy to recognize which fluorochrome combinations are best avoided because they have a very high level of spillover into each other. If you cannot avoid using some of those combinations, you should try to assign the dim markers to bright dyes that have minimal spillover. In the table, we can see some examples, such as PE-Cy5 and BB700, where the value is quite high. If you had two dim populations on either of those fluorochromes, you may not be able to resolve them very well.

0:24:21 - 0:26:22, Slide 24

So, here we have another table where we’re able to summarize all the spillover. You can see which fluorochromes are going to give you quite high spillover into the other detectors. You can actually calculate the total spread that each dye contributes. That lets you know whether that dye is going to work well, or play well, with some of the other dyes, which is very useful when you’re designing a new panel and also when you’re adding one or two extra fluorochromes or markers to an already prepared panel.

Down the bottom here, you can also calculate the total spread per detector. Detectors that have very low values will be the most sensitive for detection when all the other colors are in use. This is very important when you’re trying to resolve very dim populations. Again, for those dim markers, try to put them on the brightest dyes with the least spread, to give yourself the best chance of success.
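A minimal pandas sketch of the two summaries described above, assuming ssm is a square DataFrame of spillover-spreading values with source dyes as rows and receiving detectors as columns (that layout is an assumption; FlowJo's own export may be arranged differently):

import pandas as pd

def summarize_ssm(ssm: pd.DataFrame):
    spread_caused   = ssm.sum(axis=1)   # per dye (row): total spread it pushes into other detectors
    spread_received = ssm.sum(axis=0)   # per detector (column): total spread it receives
    # Dyes with large row totals "play badly" with others; detectors with small
    # column totals stay the most sensitive and are good homes for dim markers.
    return spread_caused.sort_values(ascending=False), spread_received.sort_values()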

0:26:22 - 0:27:51, Slide 25

I’m going to try and explain what I think some of the reasons are for the differences in staining index. Some of the things that may have an effect on the staining index and how well we’re able to resolve our cells of interest is that the laser power and one of our analyzers were able to change the laser power, and we ran some eight-peak rainbow beads just to see the effect of the laser power and the ability to resolve all eight peaks. As we increase the laser power, we’re able to resolve some of the dimmer bead populations. We can see that here we’ve collected two different – we’ve used two different collection filters for these beads and you can see that no matter how high you turn up the laser, some of the dim populations and notice were resolved in the higher wavelength, the 670-30 collection. That’s because the dye emission doesn’t peak around that area. It sort of peaks at the lower wavelength.

0:27:51 - 0:28:40, Slide 26

Here, we also looked at another analyzer where we were again able to change the laser power, and the effect it had on resolving the eight-peak beads. Laser power is important, maybe not so much for the bright-expressing beads or bright dyes, but where you have very low antigen levels or very dim populations, laser power can help improve the resolution.

0:28:40 - 0:29:02, Slide 27

Again, this is another representation of how laser power has an effect on resolving the dim bead populations.

0:29:02 - 0:31:14, Slide 28

Optical filters, band pass filters, dichroic filters: they’re made up of different layers of coating that allow you to collect the different wavelengths of interest. Earlier in the presentation I showed a couple of instruments which in theory should have been identical, having the same optical layout, and we saw that the newer instruments gave us a better staining index than the older ones. One reason for this is that the band pass filters in the older instruments have degraded. Environmental deterioration, for example moisture getting into some of the layers of the band pass filters, will have a big impact on how well these filters behave. Brisbane, in Australia, is in the sub-tropics; our summers are hot and very humid. No matter how hard we try to control the humidity and keep the temperature at a constant level, there is still a lot of moisture in the air, and I’m pretty sure that a lot of our band pass filters do degrade because of it. Something that you can try is, if you don’t think your filters are performing well, take them out of your instrument and hold them up against some light; you can actually start to see some degradation, and when you do start to see that, it’s time to swap the filters. There are different vendors that supply filters, and as the quality of the filter goes up, so does the price.

0:31:14 - 0:32:56, Slide 29

In this slide, we can see the effect that filters can have on how well we’re able to resolve populations. Here, we’ve run some eight-peak rainbow beads and swapped only the band pass filter collecting the signal. Everything else was the same, the dichroics and the laser power, and we used the same nominal band pass, collecting at 530/30. Some of the new filters may not be as good as one would think, and some of the older ones, you can see, are so bad that you can’t even resolve some of the higher peaks well. Band pass filters play a big part in how well you’re able to resolve populations and how bright a signal you’re getting on your instrument. So, if your instruments are quite a few years old, it’s really worth sitting down and checking: have your filters degraded, and do they need to be replaced? If so, when you do replace them, you’ll find that there is a marked difference in how well your instrument performs.

0:32:56 - 0:35:25, Slide 31

Photodetectors, or photomultiplier tubes, take the photons of light emitted from our fluorochromes and convert them into an electronic signal that we can then measure and display in flow cytometry. As the number of photons hitting the detector increases, the resulting current also increases. Photomultiplier tubes have different responses to different wavelengths. The manufacturers will select photomultiplier tubes for a range of wavelengths to be able to detect within the visible range. Some photomultiplier tubes have much better responses to a particular wavelength than others, and sometimes they may not be placed in the most ideal position for the most suitable wavelength. Years ago, with the instruments I had, I used to remove all the photomultiplier tubes from an instrument and then test each one individually for its response at different wavelengths. I would then place them on my machine in the positions best suited to collect each particular wavelength. The newer instruments these days make that a little more difficult; everything is bolted down and locked in the instrument. Some of the cell sorters still allow you to do that, but on most of the analyzers it’s not so easy anymore.

Something else that people should be aware of: if you have very bright dyes sending a very high signal to a detector, at that high detection rate some photomultiplier tubes may not respond in a linear fashion anymore, and they do saturate when the incident light is very, very high.

0:35:25 - 0:36:24, Slide 31

The Q, what is your Q? The Q value gives you an indication of how efficiently the detector is able to convert that light signal, or fluorescence signal, into an electronic signal, or photoelectrons. If you’ve got a cell emitting 10 photons of fluorescence that hit the detector and the conversion is five photoelectrons, then the Q value would be a 50% conversion. The average conversion in most flow cytometers, which is considered very good, is around maybe 20-30%. Not every single photon of fluorescence gets converted into a photoelectron.
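In symbols, using the speaker's own 10-photon, 5-photoelectron example:

\[
Q \;=\; \frac{N_{\text{photoelectrons}}}{N_{\text{photons}}}, \qquad Q = \frac{5}{10} = 0.5 = 50\%.
\]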

0:36:24 - 0:37:51, Slide 32

What kinds of things affect the Q value? As I showed earlier, laser power, but also laser alignment. You may have a laser with really good power, but if it’s not aligned properly and not focused properly, you’re not going to get the most optimal signal. You really need to make sure that you look after your instrument. You’ve got to have a clean flow cell or nozzle. Make sure all your steering optics and your objectives are clean, and the lenses are clean. If you’ve got neutral density filters in front of some of your detectors, that will also decrease the Q value. And, as I said before, PMT sensitivity: some detectors give a better response to some wavelengths and are less sensitive to others, especially some of the red wavelengths, so it also matters how well that PMT responds to the wavelength you are collecting.

0:37:51 - 0:39:18, Slide 33

Linking dye brightness and fluorochrome response on individual flow cytometers: they are different. The PMTs have different responses to different wavelengths. Your cytometer model and its optical configuration matter. How much noise you have in your detectors plays a part as well: do you have good, clean power, or is your lab reasonably noisy? All of these things contribute. We know that some dyes are brighter than others, but because of the variation in instruments and their collection optics, you will get a different response on different instruments. Also important is how well you look after your instrument and how clean the optics are. If things do start to degrade, it really is a good idea to replace some of your band pass filters.

0:39:18 - 0:40:38, Slide 34

What can we do to improve how we select the markers and fluorochromes when designing our panels? Ranking fluorochromes for each instrument, which is what we’ve done, is a good start. What we’ve done here on one of our Fortessas is group the different fluorochromes into very high, high, moderate, and low to very low brightness. The actual groupings of high or low fluorochrome brightness are going to be different for each instrument. A fluorochrome that is very high on one instrument may not be the same on another instrument. So, people need to be aware of that if they’re going to be staining a panel designed around this ranking on one of the other instruments; they may not get as good a result on some of the other instruments.
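A small sketch of how a per-instrument ranking like this could be generated from measured staining indexes (the dye names, staining-index values, and bin edges are all invented for illustration; each instrument would get its own numbers):

import pandas as pd

si = pd.Series({"BV421": 310, "PE": 280, "APC": 220, "FITC": 90,
                "PerCP-Cy5.5": 45, "APC-Cy7": 20})          # made-up staining indexes

groups = pd.cut(si, bins=[0, 30, 60, 150, 250, float("inf")],
                labels=["very low", "low", "moderate", "high", "very high"])
print(groups.sort_values(ascending=False))   # brightest categories first, for this instrument only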

0:40:38 - 0:42:40, Slide 35

Things to consider when you are going to be designing multicolor panels: you really need to determine how good your dye brightness is on your instrument. Try to avoid fluorochromes that spill over into each other; the less overlap there is, the better your resolution. Another suggestion is, if some of the dyes do overlap, then try to use those overlapping fluorochromes on markers that are not co-expressed, something like T and B cell markers. Try to choose bright dyes for the low-expression markers, especially fluorochromes that don’t have a lot of spillover. Understand the biology of your antigen and the question you’re asking with your experiment. Define the dye brightness for each analyzer, and then also consider the spreading error of each dye on the instrument that you’re going to be using.

Also, something you need to take into consideration is whether some of the markers you’re looking at are transient. Does something biological have to happen before that marker is expressed? In those cases, really try to use fluorochromes that are bright and have very minimal spillover. Know your flow cytometer.

0:42:40 - 0:44:14, Slide 36

What do we need to do? Well, we continually update and educate our facility users about dye complexity and panel design. We emphasize the individual optical characteristics of the instruments within the facility, and we’ll sit down with our users and help tailor their panel to the instrument of their choice. We’ve ranked all our dyes and use the features in software that allow you to calculate the spread of your fluorochromes for a particular cytometer, and we provide this as a resource for facility users to help guide them in preparing their panel designs, including some of the more complex panels for the clinical trials that we run here. We aim to give the best advice and service, because we are a service facility for our users, and we will modify our instruments as new dyes come onto the market to help further improve the service to our users.

0:44:14 - 0:44:52, Slide 37

I just wanted to thank the rest of the gang here. It was a pretty major effort over multiple days to do this work. Every single person had an instrument and basically spent almost the whole day on each instrument, running all the different titers and all the different fluorochromes, just to get all this data together. So thank you, everyone. Any questions?

0:44:52 - 0:45:14 

Thank you, Grace for your presentation. A quick reminder for our audience on how to submit questions. Simply type them into the dropdown box located on the far left of your presentation window labeled “Ask a Question” and click on the “Send” button. Grace Chojnowski will answer as many questions as time permits.

0:45:14 - 0:46:10

First question is: How important is titration of each antibody conjugate used?

It’s very important. As we saw with the staining index, having the highest titer doesn’t always give you the best separation of the cells of interest. You may actually start to increase the background a little bit more if you do have too high a concentration. Then again, if you don’t have enough, if the concentration of your antibody is too low, you’re not going to get a very good signal either. So it is crucial that you do titer your antibody before you start doing any experiment.
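As a worked illustration of the titration point above, the short sketch below picks an antibody concentration by staining index. It assumes one common definition, SI = (MFI_pos - MFI_neg) / (2 * SD_neg), and the concentrations and fluorescence values are hypothetical placeholders rather than data from this webinar.

```python
# Minimal sketch: choose an antibody titer by staining index (SI).
# Assumes the common definition SI = (MFI_pos - MFI_neg) / (2 * SD_neg);
# the numbers below are hypothetical, not data from this webinar.

def staining_index(mfi_pos, mfi_neg, sd_neg):
    """Separation of the positive population from background."""
    return (mfi_pos - mfi_neg) / (2.0 * sd_neg)

# Hypothetical titration series: concentration (ug/mL) -> (MFI+, MFI-, SD-)
titration = {
    0.125: (3200,  90, 40),
    0.25:  (5800, 100, 42),
    0.5:   (7400, 120, 45),
    1.0:   (7900, 210, 60),   # background starts creeping up
    2.0:   (8100, 420, 95),   # too much antibody: SI drops
}

best = max(titration, key=lambda c: staining_index(*titration[c]))
for conc, vals in sorted(titration.items()):
    print(f"{conc:5.3f} ug/mL  SI = {staining_index(*vals):6.1f}")
print(f"Optimal titer by SI: {best} ug/mL")
```

In this hypothetical series, the staining index peaks at an intermediate concentration, which is exactly the behavior the answer describes: too much antibody raises the background and the separation drops again.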

0:46:10 - 0:46:26

Thank you. You mentioned doing an antibody titer using comp beads. How does this compare to titration on the cells and what are the advantages and disadvantages of doing titration using comp beads?

0:46:26 - 0:48:00

The advantage is that you don't need to collect large volumes of blood for all the fluorochromes. You can imagine 61 different antibodies, each at numerous titers; we wanted to make sure we had the same sample throughout, and you do need to collect a lot of blood, which some volunteers are happy to do, but only up to a point. With comp beads, if you get the same lot number, the concentration is the same. The one disadvantage with comp beads is that they have a lot more binding sites on the surface, so the amount of antibody that binds to them is always a lot higher, and you will end up using a lot more antibody. What we did do is try to make sure that the concentration we used was consistent with every comp bead. There are pluses and minuses to both, but comp beads are there all the time, and there isn't as much effort in preparing them as there is when you are using peripheral blood.

0:48:00 - 0:49:43

Thank you. The next one is: I like how you classify dyes into brightness categories, say dim, moderate, bright, and very bright. What about doing this for antigen density information, say low, medium, and high density?

Yes, that is important as well. If you know or expect a high antigen density, you would rank the antigens in the same way and pair the fluorochromes that aren't as bright with the antigens that are highly expressed. For example, CD8 has a very high antigen density, so you could use some of the dimmer dyes for CD8, whereas markers that may have a very low antigen density you would assign to some of the brighter dyes. You would also try to use dyes that are bright and that don't give you too much spillover. For the antigens that you're not sure of, or that you're testing and don't know what their expression is going to be, or ones with transient expression, again you would try to assign dyes that don't have too much spillover and are reasonably bright as well.
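To make the pairing rule just described concrete (brightest dyes for low-density or uncertain antigens, dimmer dyes for densely expressed antigens such as CD8), here is a minimal sketch. The dye ranking and the antigen ordering are hypothetical placeholders; in practice the ranking comes from your own instrument characterization.

```python
# Minimal sketch of the pairing rule: assign the brightest available dyes
# to the lowest-density (or unknown/transient) antigens, and the dimmest
# dyes to the most densely expressed antigens.
# Both lists below are hypothetical placeholders.

dyes_bright_to_dim = ["PE", "APC", "FITC", "PerCP"]            # ranked on *your* instrument
antigens_low_to_high_density = ["CD25", "CD19", "CD3", "CD8"]  # low density first

# Brightest dye -> lowest-density antigen, and so on down both lists.
panel = dict(zip(antigens_low_to_high_density, dyes_bright_to_dim))

for antigen, dye in panel.items():
    print(f"{antigen:5s} -> {dye}")
```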

0:49:43 - 0:50:00

Thank you. The next one is, can you summarize the instrument characterization studies you recommend doing for any flow cytometer? Does this characterization ever need to be repeated?

0:50:00 - 0:51:18

It should be repeated any time your photomultiplier tubes are changed, the instrument is realigned, you get a new laser, or any of the optical components are changed; any of these will make a big difference. For one of the older instruments that was giving us a poor response, we've started to purchase new filters. We haven't repeated the studies yet, but once we do, we expect to see a marked difference and to be able to show the difference between what the characteristics were before we replaced a lot of the filters and how the instrument behaves after some of the optical components were changed. Realistically, the characteristics will change as soon as you change anything on the instrument. Also, over time, as I showed in some of the slides, things will degrade, and the characteristics won't be as good as they may have been a few years prior.

0:51:18 - 0:51:31

Thank you. Looks like we have time for one more question, and that will be this one: What do you think is the main reason for the differences in the staining index of the same dye on different instruments?

0:51:31 - 0:52:41

The main difference could be the laser, particularly for some of the yellow-green dyes. If you're exciting them with a yellow-green laser, you are going to get a better response. Some instruments don't have a yellow-green laser and use 488 nm excitation, which isn't as optimal for those dyes; you're not exciting the dye at its optimal absorption. The other component, which I think is very important, is the collection filters. We found that they make a huge difference to how good a signal you get, depending on the quality of the filter and whether it has degraded. Also, to some extent, the actual photomultiplier tube matters; some just have a little bit better sensitivity at certain spectral wavelengths.

0:52:41 - 0:52:51

I would like to once again thank Grace Chojnowski for her presentation. Grace, do you have any final comments?

0:52:51 - 0:53:09

I can't think of any at the moment, but if anybody would like to ask any questions, please feel free to contact me. My contact details are on our website, the QIMR Berghofer website.

0:53:09 - 0:53:45

Thank you and I’d like to thank the audience for joining us today and for their interesting questions. As Grace said, questions we did not have time for today will be addressed by the speaker via the contact information you provided at the time of registration. We would like to thank our sponsor, Thermo Fisher Scientific, for underwriting today’s educational webcast. This webcast can be viewed on demand through January 2019. LabRoots will alert you via email when it’s available for replay. We encourage you to share that email with your colleagues who may have missed today’s live event. Until next time. Goodbye.


Analyzing your CRISPR gene editing efficiencies using flow cytometry

In this webinar, Natasha Roark, scientist at Thermo Fisher Scientific, discusses CRISPR gene editing technology and how flow cytometry can be used to analyze editing efficiencies.

00:00:00 - 00:00:21

Welcome to our webinar, "Analyzing your CRISPR Editing Efficiencies Using Flow Cytometry," presented by Biocompare and sponsored by Thermo Fisher Scientific. My name is Peter Fung. I'm the managing editor of Biocompare, and I'll be the moderator for today's presentation.

00:00:21 - 00:00:38

Before we begin, I'd like to inform our viewers that this event will hold a live question and answer session at the end of the presentation, and you can submit a question at any time using the Q&A box located on your screen. Also note the resource list containing documents and links pertaining to this webinar.

00:00:39 - 00:01:19

Now allow me to introduce today's presenter. Natasha Roark is a scientist in the Synthetic Biology Department at Thermo Fisher Scientific. She received her Bachelor of Science degree from the University of California, Santa Cruz, in health sciences and molecular, cellular and developmental biology. Her current work is focused on molecular and cellular biology, high-throughput cloning, genome editing, cell engineering, and cell analysis. Today, she'll share some examples of using flow cytometry in genome editing applications and cell engineering workflows. Welcome, Natasha. We're looking forward to your presentation.

00:01:19 - 00:01:38, Slide 1

Thank you so much for having me. Today, I'm going to discuss analyzing your CRISPR editing and genome editing efficiencies using flow cytometry, and discuss how we incorporate flow cytometry and use it as a tool in our general genome editing workflow.

00:01:38 - 00:01:53, Slide 2

First, what is genome editing? Genome editing is an approach whereby we insert, modify, or replace DNA bases in a genomic DNA sequence.

00:01:53 - 00:02:50, Slide 3

Why do we use genome editing? We use genome editing to study gene function by creating a knock-in or knockout, to create a target-specific gene mutation, perhaps a SNP correction, a targeted transgene addition, or a heritable modification. We also use it to label endogenous genes, to stably integrate an expression cassette into a desired cell line, and for general tissue and cell engineering purposes to produce novel functions. This can be used to create very powerful animal disease models. It can, of course, be used for high-throughput screening and tissue disease modeling, for stem cell engineering, and even for applications such as gene therapy and agricultural science, in which we can create disease-resistant transgenic plants.

00:02:50 - 00:03:16, Slide 4

A little background on genome editing: there are traditional methods of genome editing that you may be familiar with, a couple of these being engineered meganucleases and zinc-finger nucleases. More recently, we have used transcription activator-like effector nucleases, called TALENs, and even more recently the CRISPR/Cas system (clustered regularly interspaced short palindromic repeats).

00:03:16 - 00:03:46

TALENs and zinc-finger nucleases are similar. They're proteins with programmable DNA-binding domains that can be engineered to bind to a specific DNA sequence, allowing targeting of a specific mutation at a specific genomic locus. CRISPR/Cas, however, is an RNA-guided DNA endonuclease system that leverages a short non-coding RNA to direct the Cas9 endonuclease to the DNA target site.

00:03:46 - 00:05:07, Slide 5

CRISPR/Cas9 is truly becoming a transformative technology in the genome editing field, and as you can see from the number of publications and applications that now use CRISPR/Cas9, it's all over the news. It has had a huge impact on the entire field. A little background on CRISPR/Cas9, if you're not familiar: Cas9 is a bacterial endonuclease, and it is directed by a small guide RNA that's actually a chimera of a target-specific CRISPR RNA, which confers specificity using standard Watson-Crick base-pairing rules, and a tracrRNA, which associates with the Cas9. Together these are expressed in mammalian systems as a chimera that we refer to as a guide RNA, and these can be targeted to a genomic locus to introduce a DNA double-stranded break virtually anywhere that contains a PAM sequence, an NGG sequence. CRISPR/Cas9 has been so readily adopted and has been such a transformative technology because never before have we had a genome editing tool that is as easy to design and as efficient as CRISPR, and it's very easy to implement into any workflow.
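Since targeting only requires a roughly 20-nucleotide protospacer sitting immediately upstream of an NGG PAM, candidate sites can be enumerated directly from a sequence. The sketch below is illustrative only and scans a single strand; real design portals, such as the one mentioned later in this talk, also score off-target risk and predicted activity.

```python
# Minimal sketch: enumerate candidate SpCas9 target sites on one DNA strand.
# A site is a 20-nt protospacer immediately followed by an NGG PAM.
# Real design tools additionally score off-target risk and predicted
# activity; this only lists raw candidates on the forward strand.
import re

def find_candidate_sites(seq, protospacer_len=20):
    seq = seq.upper()
    sites = []
    # Zero-width lookahead so overlapping PAMs are all found.
    for m in re.finditer(r"(?=([ACGT]GG))", seq):
        pam_start = m.start()
        if pam_start >= protospacer_len:
            protospacer = seq[pam_start - protospacer_len:pam_start]
            sites.append((pam_start - protospacer_len, protospacer, m.group(1)))
    return sites

# Hypothetical example sequence, not a real target locus.
example = "ATGCGTACGTTAGCCTAGGCTAGCTAGGCTAACGGATCGATCGTAGCTAGGCTAGCTAGCATCGGG"
for pos, proto, pam in find_candidate_sites(example):
    print(f"pos {pos:3d}  protospacer {proto}  PAM {pam}")
```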

00:05:07 - 00:06:05, Slide 6

A little background on the CRISPR/Cas9 system: I already mentioned that CRISPR stands for clustered regularly interspaced short palindromic repeats. Now, where does this come from? This is a prokaryotic adaptive immune response system that confers resistance to exogenous nucleic acids, for example phage DNA, and it's an RNA-guided DNA endonuclease system. How it works is that, following a viral invasion, bacteria will integrate a small piece of foreign DNA into the CRISPR loci in their chromosome. These loci are then transcribed into short CRISPR RNAs that are complementary to the previously encountered foreign DNA. This prepares the bacterium for subsequent viral attack: the CRISPR RNAs form complexes with Cas proteins to make effector complexes that guide detection and cleavage of the target DNA.

00:06:05 - 00:06:23, Slide 7

Precision genome editing using engineered nucleases follows the same basic principle: all of these engineered nucleases introduce a double-stranded break at a specific location in the genome. These double-stranded breaks are then repaired by the endogenous cellular machinery.

00:06:23 - 00:06:35

Here we have an example of a double-stranded piece of DNA. We add our engineered nuclease, the zinc-finger nuclease, the TALEN, or CRISPR/Cas9, and we get a resulting double-stranded break.

00:06:35 - 00:06:48

Now, once this double-stranded break occurs, it can be repaired through two different pathways: non-homologous end joining and homologous recombination.

00:06:48 - 00:06:48

These two pathways compete with one another and utilize different proteins for this type of repair. Non-homologous end joining is definitely the most prevalent; it occurs at a much higher frequency, and it is a very random process. As the endogenous cellular machinery repairs the double-stranded break, it's messy, and it generally results in indels, insertions or deletions of DNA, at the target locus. This is generally used to create a knockout: to knock out a gene, to knock out a protein of interest. Alternatively, homologous recombination is much more specific. In this case, we actually introduce a donor DNA with our genome editing tool, and this allows us to create a very precise knockout, perform a gene correction, a SNP change, for example, a small deletion, an insertion, or a knock-in, and we indeed use it for all of these applications: mutating a promoter, adding a promoter, adding a gene, or even adding an endogenous tag.

00:06:48 - 00:10:01, Slide 8

One of the methods that we use to assay our genomic cleavage is the GeneArt Genomic Cleavage Detection (GCD) Kit. This is a very quick, relatively straightforward, and quantifiable way to assay your genome editing efficiency at a given locus of the actual endogenous DNA. It is an essential tool for monitoring efficiency, and it's much quicker and much less expensive than sequencing; think of it as a quick check of your actual gene modification. How this works is that we transfect our cells of interest with the desired genome editing tool, which could be the GeneArt CRISPR nuclease vector in this example. Once we transfect these cells and allow the tool to work, we harvest the cells, pellet them down, and create a cell lysate. We then design target-specific primers flanking the region of interest, so we amplify the region of interest containing the modification and, of course, the indel that was created by non-homologous end joining. After PCR amplification, we denature and re-anneal the PCR amplicon to form heteroduplexes, which creates mismatches in the DNA; these can then be cleaved by our mismatch detection enzyme and run out on a gel. You can see from this image that this is a very quick, quantifiable, and accurate way to measure genome modification efficiency. It's easy, takes about five hours start to finish, and it's actually quantitative; we've confirmed this with next-gen sequencing.
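For the quantification step, the gel band intensities are typically converted into an editing efficiency. The sketch below assumes the formula commonly used for mismatch-cleavage assays of this type, indel % = 100 x (1 - sqrt(1 - fraction cleaved)); the band intensities are hypothetical densitometry values, not results from this webinar.

```python
# Minimal sketch: estimate indel (cleavage) efficiency from gel band
# intensities in a mismatch-cleavage assay such as the GCD workflow.
# Assumes the commonly used formula for these assays:
#   fraction_cleaved = sum(cleaved bands) / (parental + cleaved bands)
#   indel % = 100 * (1 - sqrt(1 - fraction_cleaved))
# Band intensities below are hypothetical densitometry values.
from math import sqrt

def indel_percent(parental_band, cleaved_bands):
    cleaved = sum(cleaved_bands)
    fraction_cleaved = cleaved / (parental_band + cleaved)
    return 100.0 * (1.0 - sqrt(1.0 - fraction_cleaved))

print(f"Estimated indel efficiency: {indel_percent(7000, (1200, 1100)):.1f}%")
```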

00:10:01 - 00:10:44, Slide 9

Today, I'm going to take you through this general cell engineering workflow, which we really strive to address from start to finish in my department. The premise is design, deliver, and detect. We start by identifying our target sites, designing and ordering the appropriate tools, delivering these editing tools to healthy cells with high transfection efficiency, and then detecting, validating, and quantifying our genome editing efficiency or knockout efficiency through the various products that we're creating or various internal assays that we use.

00:10:44 - 00:12:41

I have a few examples here of the variety of products that we use in our day-to-day workflow and that we also offer as options for cell engineering. Beginning with design, we start with the GeneArt CRISPR Search and Design Tool portal, where we enter our query, our genomic DNA of interest, and identify optimal CRISPRs based on our proprietary algorithm, which uses the most up-to-date rules and principles we know about guide RNA design. Once we choose our guide RNAs, we can choose delivery depending on our application. We may want a vector, in which case we could use the GeneArt CRISPR nuclease vector. We may want IVT (in vitro transcribed) guide RNA using the GeneArt Precision gRNA Synthesis Kit, or maybe we want stable integration, in which case we use our Invitrogen LentiArray CRISPR Libraries. Then for delivery, we have a variety of options as well: Neon transfection, lipid-based methods, again Lenti as a system for delivery, and GeneArt Platinum Cas9 nuclease. But what I really want to focus on most today is the detection, the confirmation of this genome editing, and the different ways that we use flow cytometry and imaging techniques to help us assay and quantify our genome editing and improve our genome editing conditions. Here I'm going to talk about how we use the Attune NxT Flow Cytometer, our GeneArt Genomic Cleavage Selection Kit, and the EVOS Imaging System, and of course, in the end, all of these methods are verified using Ion PGM or Sanger sequencing.

00:12:41 - 00:13:02, Slide 10

First, I'm going to talk about our genomic cleavage selection reporter and how we can use it with TAL effectors or CRISPR guide RNAs. This is a nice validation tool for genome editing that has a lot of versatility and is quite simple to use.

00:13:02 - 00:13:44

On the left here, we have the Genomic Cleavage Selection (GCS) vector. This vector consists of a promoter, a V5 tag, and an OFP coding sequence for orange fluorescent protein that's disrupted by the target sequence binding site. In the sequence in between the OFP, we'll have either the TALEN binding site or the CRISPR guide RNA binding site cloned into the vector. The vector comes linearized, and we just order oligos with the correct overhangs and clone them in; now we have an OFP disrupted by our target sequence, as well as a T2A self-cleaving peptide with the CD4 coding sequence.

00:13:44 - 00:14:18

Now that we have this vector, we can co-transfect this reporter vector with nucleases, CRISPR or TALEN. Upon successful cleavage of the reporter by the nuclease, and repair of this double-stranded break by the endogenous repair mechanisms, just as happens on the endogenous DNA, we will have OFP+ cells, fluorescing orange, as well as CD4 being expressed on the membrane.

00:14:18 - 00:14:57

The nice thing about this system is that it's very simple to design and construct, and it's very quick. Within 20 hours of your transfection, you can generally see OFP expression under the fluorescent microscope, and that will really tell you three things right away: that your genome editing tool and your vector are entering the cells, some idea of the efficiency with which they're entering and cleaving, and that you designed these properly, which is key.

00:14:57 - 00:17:41

There are sometimes issues with expression of CRISPRs. There are promoter constraints. There are a variety of things that can affect the efficiency. This actually provides a way to assay our genome editing tool: you can check several different CRISPRs and have a very quick and rapid method with which to assay their cleavage efficiency using flow cytometry. You can see in this panel that this is exactly what we did. We are showing three different cell lines here, 293SPs, U2OS, and A549 cells, as well as three different targets: the AAVS1 and HPRT safe harbor loci, and the IP3R2 locus. On the top panel, we co-transfected this reporter with the AAVS1 guide RNA or HPRT1 guide RNA, and also co-transfected with an irrelevant guide RNA to account for background subtraction and any resulting auto-fluorescence, and then measured the OFP expression using flow cytometry. We got a nice quantifiable result for how well these cleavage tools are working. On the bottom panel, the same experiment was done but using TALENs. You can see these are slightly lower efficiency, and while they still work across cell lines, they are not quite as robust in their OFP expression. This actually correlates quite nicely with genomic cleavage detection and sequencing methods. Of course, this is exogenous DNA; it is, as the name says, a surrogate reporter, so it won't capture considerations such as the accessibility of the locus, and it isn't going to be able to tell you if you have a bi-allelic knockout, but it's a very quick, easy, and quantifiable way to look at your genome editing tool in the cell.

00:17:41 - 00:17:52, Slide 11

It also offers the added benefit of being able to enrich. Oftentimes we're working with very difficult-to-transfect cell lines, or maybe difficult loci. CRISPRs are PAM-limited: you are limited by that protospacer adjacent motif, and sometimes there are not as many options as you would like, so you may have to go with a CRISPR with slightly lower efficiency to get the targeted mutation that you want. You really want to be able to do everything you can to enrich for these edited cells, because one of the biggest bottlenecks in the workflow is isolating single clones, carrying them out, and then performing sequencing on them, for example.

00:17:52 - 00:19:12

This provides a method to enrich for these cells. We can either use CD4-based Dynabeads enrichment, in which we use CD4-coated Dynabeads and pull these cells out of solution using magnets, or we can use fluorescent cell sorting to sort for OFP+ cells. That's exactly what we did here. You can see this gel below; this is the genomic cleavage detection assay of cells pre- and post-enrichment with OFP and CD4. In Lane 1, you can see that you have virtually no genomic cleavage quantifiable on the gel. Lane 2 is the flow-through sample from the CD4 beads, but then in Lanes 3 and 4 you can see that the efficiency is now 19.3% and 13.9% respectively, which is greater than a 10-fold enrichment. Now, in this case, 19% and 13% are still relatively low, but this turns a nearly impossible experiment, in terms of the amount of work and plates it would take to carry out these clones, into an experiment that's very workable and very doable.

00:19:12 - 00:19:44, Slide 12

We've also quantified all these different methods. Here's an example of analyzing genome editing efficiencies detected by the Genomic Cleavage Selection Kit, the Genomic Cleavage Detection Kit, and NGS sequencing. In this workflow, we designed perfect-match TALENs, transfected the cells with the Neon transfection system, and then used the Genomic Cleavage Selection Kit as well as amplified the target loci from the cell lysate.

00:19:44 - 00:20:38

Then we took these through Ion PGM sequencing and the Genomic Cleavage Detection Kit. In this case, you can see that the Genomic Cleavage Detection and the NGS data really correlate quite nicely with one another. The GCS vector also correlates, though it reads slightly lower, and we do see this in different cell lines. Again, this is a reporter; it's going to depend on cell health and how well the cell expresses OFP, and again, it's not able to detect a bi-allelic knockout. In some cases, it will overestimate your genomic cleavage; in other cases, it will underestimate. It's really just meant as a means for assaying these tools.

00:20:38 - 00:22:26, Slide 13

Another method where we use flow cytometry to help our genome editing workflow is with our GeneArt CRISPR Nuclease Vectors. We use these for genome editing and transfection efficiency monitoring simultaneously. We have GeneArt CRISPR Nuclease Vector kits; the cloning is very similar to the GCS vector that I previously described, but in this case you're actually going to clone in your guide RNA sequence. There's a U6 promoter, there's also a Cas9, and then you have the choice of either CD4 or OFP being expressed in this vector. Now, this provides another opportunity for analysis and enrichment. You can see on the top panel that you can very quickly get an idea of your transfection efficiency at the same time that you are delivering your genome editing tool, and you can analyze that via flow cytometry and use it to optimize conditions. Similar to the GCS vector, you can also sort these cells. In this example here, I have a pre-sort population in which 9.5% of cells were OFP+. I sorted these on OFP+ cells and ended up with a post-sort OFP+ population of 99%. I saved these samples and ran the Genomic Cleavage Detection Kit on them, and you can see that pre- and post-sort I got more than a 30-fold enrichment, from 1.4% to 48.1% cleavage. This is again a very valuable tool, particularly when we're looking at very difficult-to-transfect cells, or cells where our options for genome editing tools are very limited.

00:22:26 - 00:22:57, Slide 14

Again, this is also available with CD4. This is just an example; the CD4 vector is also quite versatile. We can label it with a CD4 antibody conjugated to Alexa Fluor 488, which is what I did here, analyze this on the EVOS, run flow cytometry, and then also run the Genomic Cleavage Detection Kit. This gives us ways to correlate transfection efficiency with cleavage efficiency, evaluate different tools, and optimize assays.

00:22:57 - 00:26:09, Slide 15

I'm going to switch gears a little bit and talk a little less about delivery and more about some of the general tools that we use flow cytometry for in the lab. Once we get our edited clone, we really need to validate it, just like any cell line. This is the cell line we're going to use, keep, provide to customers, and do a tremendous amount of downstream work on, so having high sensitivity and resolution for fluorescent proteins and dyes is very important for our options when working with them. Because we're genome editing, we're looking mainly at cell expression or gene modulation, so we work with fluorescent proteins. Fluorescent proteins can be a little difficult to work with in that they have very large emission spectra, they're excited by multiple lasers, there's a lot of bleed-through, and they can be a little more difficult to resolve than dyes. Furthermore, we want to be able to resolve them alongside dyes. Here's an example of some of the fluorescent proteins that we use: mKate, TagBFP, OFP, Emerald GFP, and RFP. We regularly use these in conjunction with viability dyes, and we use them to validate the cell lines that we're making. It has really increased our options for creating these different cell lines, because we have four lasers to work with and 14 possible colors. You can see in the example above, this is a cell line I was making: a U2OS NF-kB GFP fusion protein that's also stably expressing Cas9. In doing this, I wanted to screen several different clones and pick the best one. You can see that not only can I very clearly see the separation between this dimly expressed GFP and the negative sample, but I can also distinguish SYTOX-positive cells from GFP-positive cells; it's really that sensitive. Additionally, in the bottom panel, you can see the same panel from left to right: the U2OS Cas9 cells, the U2OS NF-kB GFP Cas9 cells, and then cells treated with SYTOX Orange. There's really no overlap between the two, and you can very clearly distinguish SYTOX Orange from GFP+ cells, as well as SYTOX Green from GFP+ cells. This really offers us a lot of different options, because we make a lot of reporters using a variety of different fluorescent proteins, and it's really nice not to be limited by our analysis tool for this detection.

00:26:09 - 00:27:25, Slide 16

As I mentioned before, we use it for clone selection and validation as well. As we're creating these stable cell lines, I'm often screening dozens upon dozens of clones and wanting to choose the best ones. The high-throughput capabilities of the Attune [NxT Flow Cytometer], and really how fast and easy it is to use and how straightforward the workflow is, make a huge difference in our workflow: I can run a 96-well plate and screen 96 clones in about 30 minutes with virtually no sample prep. I just briefly trypsinize the cells, quench with FACS buffer, put the plate in the autosampler, and 30 minutes later I have my results. This is an example of working with the same cell line and choosing the clone that I wanted for my desired mutation. In green, I highlighted what I found to be the best one, H11 on the far right, given that it has the lowest SYTOX Blue-positive population as well as the highest GFP+ population, the most homogeneous cell line.
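To picture that clone-screening step, here is a minimal sketch that ranks wells by the two criteria just described, a high GFP-positive percentage and a low SYTOX-positive percentage. The well names and percentages are hypothetical, not the values shown on the slide, and the composite score is just one simple way to combine the two criteria.

```python
# Minimal sketch: rank screened clones by the criteria described above,
# i.e., high %GFP+ (homogeneous expression) and low %SYTOX+ (dead cells).
# The per-well percentages are hypothetical, not the slide's data.

clones = {
    "B03": {"gfp_pos": 62.0, "sytox_pos": 9.5},
    "D07": {"gfp_pos": 88.5, "sytox_pos": 4.1},
    "H11": {"gfp_pos": 93.2, "sytox_pos": 1.8},
    "F02": {"gfp_pos": 90.1, "sytox_pos": 12.3},
}

# Simple composite score: reward GFP positivity, penalize dead cells.
def score(well):
    return well["gfp_pos"] - well["sytox_pos"]

ranked = sorted(clones.items(), key=lambda kv: score(kv[1]), reverse=True)
for name, well in ranked:
    print(f"{name}: GFP+ {well['gfp_pos']:.1f}%  "
          f"SYTOX+ {well['sytox_pos']:.1f}%  score {score(well):.1f}")
print(f"Best clone by this score: {ranked[0][0]}")
```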

00:27:25 - 00:29:32, Slide 17

Another area where I use flow cytometry, and we employ it in our group on a regular basis, is the production of our Invitrogen LentiArray CRISPR Libraries. Currently, one of our new product offerings is the LentiArray CRISPR Libraries, and you can see the plate configuration on the right. This consists of four guide RNAs per gene, one gene per well, and 80 genes per plate. These arrayed libraries are separated by family; for example, we have the kinase library, the protein receptor library, the phosphatase library, and so on. We're providing this not only as glycerol stocks ready for DNA preparation, but also as high-titer arrayed virus for screening. Every gene of every library actually goes through an antibiotic selection titer to confirm that it meets our product specifications. This is an extremely time-consuming and laborious process, so we use GFP to optimize this titering process, as well as to provide some guidance for the customer. These are controls that are also available for sale to the customer, and that we use internally to optimize our transfection conditions and our antibiotic selection conditions. We can also use them as negative or positive controls for gene modification, and we use them as a high-throughput titer check of our Invitrogen LentiArray CRISPR Libraries. You can see the vector up on top: there's a U6 promoter driving the guide RNA expression, which is cloned in for each site, and then in the control vector we also have GFP and puromycin, the GFP to aid us in titering and the puromycin for selection.

00:29:32 - 00:29:58

We have two different particles that we provide. One is a positive control; this expresses a guide RNA targeting the HPRT site. This is a high-cleavage, high-activity, safe harbor site that the customer can use as an actual control for genome editing and genome modification. Then we have a negative control particle, which expresses a scrambled guide RNA that matches nowhere in the human genome.

00:29:58 - 00:31:15, Slide 18

Now, I mentioned that we do an antibiotic selection titer of every well in this library, but that's very time-consuming and very laborious, and we want a quick internal check before we go through that entire process of titering every single well. So our general workflow is this: we seed our cells, transduce them with our newly made lentivirus, change the media on day three, and then, as early as day five, we can perform a high-throughput GFP titration using the Attune [NxT Flow Cytometer]. Again, it takes half an hour per plate, and I think that's a very quick, very quantifiable titer check before moving ten days further down the workflow through antibiotic selection. On the bottom left is an example of the plate map provided by the Attune [NxT Flow Cytometer]; it gives us a very quick picture of our titer throughout the wells and our different dilutions throughout the plate. Then on the right, you can see our calculation for the titer of this plate. We've developed a calculation for the titer, which is the percent of GFP+ cells times the cell number and dilution factor, divided by the volume of viral inoculum.
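A minimal sketch of that titer calculation, implementing the formula exactly as stated (percent GFP-positive cells times the cell number at transduction times the dilution factor, divided by the volume of viral inoculum), is shown below with hypothetical input numbers.

```python
# Minimal sketch of the functional titer calculation described above:
#   titer (TU/mL) = (%GFP+ cells * cells at transduction * dilution factor)
#                   / volume of viral inoculum (mL)
# The input numbers are hypothetical examples, not the webinar's data.

def lenti_titer_tu_per_ml(percent_gfp_pos, cells_at_transduction,
                          dilution_factor, inoculum_volume_ml):
    return ((percent_gfp_pos / 100.0) * cells_at_transduction *
            dilution_factor / inoculum_volume_ml)

# Example: 12% GFP+ cells, 5e4 cells/well, 1:100 dilution, 0.1 mL inoculum
titer = lenti_titer_tu_per_ml(12.0, 5e4, 100, 0.1)
print(f"Functional titer: {titer:.2e} TU/mL")
```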

00:31:15 - 00:31:46

Again, this is very quick: it's just taking the cells, trypsinizing them, quenching them, and running them. We have three Attunes that we use, and on a titering day these run all day long. One great benefit is that we see very little clogging and very few problems, even though we're going directly from the plate with very minimal cell preparation.

00:31:46 - 00:33:10, Slide 19

Now, I'm going to talk about how we use flow cytometry to actually measure our GFP knockdown efficiency. I've talked about genome editing, target reporters, titer checks, and transfection optimization, but we also use flow cytometry to actually measure the knockdown, which is really the goal of these libraries that we're creating. We've designed several different assays to help us do that, and here's one. We have a HEK293 GripTite GFP stable cell line, which is stably expressing GFP. Then, using our design algorithms, we design, clone, and produce lenti CRISPR guide RNA viral particles targeting this GFP. We then co-infected these with Cas9 and NF3, followed by antibiotic selection for three, seven, or 14 days, and at each time point analyzed the GFP knockdown efficiency and the genome modification efficiency. You can see up here an example of the triplicate sample plate map for one of the selection times, the fluorescence microscopy showing the knockdown, and an example of the genomic cleavage detection at this point.

00:33:10 - 00:35:09, Slide 20

Here in the middle, we have the overlay plots for each of these different guides. Let's start with the cytometry data on top, with three, seven, and 14 days moving down. Across the panels, we have target one, target two, target three, and target four, and then in the right panels we have both the GFP pool and the scramble pool. The GFP pool is a pool of targets one, three, and four, to simulate the design of our libraries, and the scrambled pool is a pool of three different scrambled guide RNAs that also match nowhere in the genome. We infected these cells and assayed them based on single guides as well as pooled guides. You can see from the overlay plots that GFP decreases over time, but at very different rates. On the left, I'm looking at an ANOVA of percent GFP, and you can investigate the selection conditions: at three, seven, and 14 days, the GFP knockdown changes quite a bit for some targets but less for others. Of course, everything increases by 14 days, but we want to make things as quick as possible, and based on the data from this experiment, seven days seems just fine to get maximal selection. On the right, you can see an ANOVA analysis of our genomic cleavage efficiency in this experiment, and while the genomic cleavage correlates, it's not the same as looking at the actual knockdown. There are a few mysteries here, such as why target three, for example, gives a very decent genomic cleavage but really has much less knockdown than the others.

00:35:09 - 00:36:34, Slide 21

Breaking this up by target and then looking at days of selection for each target, you can see that for target one, the GFP knockdown increases dramatically at seven days post-selection and then modestly to 14. Target three goes through a step-wise GFP decrease, and target four really drops at seven days, almost to nothing, and virtually stays there. Similarly, the pooled targets drop dramatically at seven days and stay there. The other information that can be gleaned from this is that our pool design is really quite good, in that the pooled guide RNAs targeting a specific gene should be as good as the best genome editing tool in the pool, and that's one of the reasons we like to design four per gene: it really maximizes your efficiency of getting that knockout. As we're learning more and more about CRISPR, the non-homologous end joining that results actually happens in a pretty predictable manner; we just haven't completely elucidated the rules to correctly make that prediction. So this definitely increases our ability to get a gene knockout over time.

00:36:34 - 00:37:17

Looking at the pooled analysis, you can see that the GFP knockdown for each target is quite different, but it really varies more by target than it does by days of selection. You can see on the right-hand histogram that, when pooling all the data together (because we really want to use this as an example to provide guidance for customers, and for ourselves, in optimizing these assay conditions), seven days appears more than sufficient to get the desired gene knockdown, based on this model.

00:37:17 - 00:38:45, Slide 22

Okay, so in addition to using flow cytometry to verify our knockout efficiency, we can also use flow cytometry and these fluorescent proteins to verify homologous recombination. In this particular experimental setup, we take advantage of the sequence homology between BFP, blue fluorescent protein, and GFP, green fluorescent protein. In this specific region, these differ by just one nucleotide. As you can see from the panel on the right, in the control we have a cell line that is expressing BFP, and when transfected with the IVT guide RNA and the Cas9 mRNA, we see a restoration of GFP. Now we have a quantifiable and verifiable assay for HR efficiency, a nice reporter assay, and the other nice thing is that it's stably integrated, so we now have this stably expressing BFP cell line and can use it to optimize all sorts of assay conditions. You can see from the video playing down here on the right that, over time, you can very easily see the homologous recombination restoring GFP.

00:38:45 - 00:41:23, Slide 23

Now, using that assay, we can further investigate homologous recombination. Here are two of the different assays we have developed for assaying homologous recombination. One is the conversion of BFP to GFP using that one-nucleotide difference. Another is the creation of a GFP mutant and then restoration of GFP expression with a donor. You can see the designs up here on the top, the six-nucleotide change on the left and the one-nucleotide change on the right, and the fluorescent images when we transfect with just the Cas9 RNP (that's the Cas9 protein with the guide RNA), and then the Cas9 RNP with donor. You can see that we get very efficient homologous recombination, and we can use flow cytometry to assay this, which we did down below. After we developed this, we further investigated it and tried many different conditions to very quickly, and in a high-throughput manner, analyze and quantify the methods that most effectively support homologous recombination. This is all GFP data. For flow cytometry, we co-transfected with sense and antisense oligos to see which is more effective. We also tried sequential delivery, meaning starting with the RNP followed by the donor DNA, and also the donor DNA followed by the RNP, and then did a dose response. We did a similar experiment on the right; this is the one-nucleotide change that I previously described. In this case, we see the BFP convert to GFP and use this to also optimize sequential delivery and dose response, that is, oligonucleotide and RNP dosage for electroporation. Again, electroporation is another thing that we had to optimize; it varies greatly and is going to produce very different efficiencies based on the cell line and the type of targeted mutation you're trying to make. We use all of these as assays to optimize our conditions before moving on to an endogenous gene of interest.

00:41:23 - 00:42:09, Slide 24

These are some of the flow cytometry applications we use in our genome editing and cell engineering workflow. I talked about how we use fluorescent surrogate reporters and genomic cleavage, how we use these reporter-type systems to deliver genome editing tools while monitoring transfection efficiency, how we can use them to enrich for genetically modified cells, and how we can use this for high-throughput viral titering and workflow optimization. I talked a little bit about knockout screens and cell line validation, and really the next thing that we're focusing on is using flow cytometry to perform various phenotypic screens.

00:42:09 - 00:42:54, Slide 25

Our goal, again, is really to create complete and optimized solutions for these genome editing workflows from start to finish, from design to validation, and flow cytometry is an integral part of the entire workflow, not only being used for analysis, assay optimization, and QC checks for our products, but also as a means to enrich for our edited cells and validate our chosen cells, via proliferation assays or LIVE/DEAD cell stains, and so on.

00:42:54 - 00:43:11, Slide 26

So, I’d like to acknowledge my team. This is our site, Thermo Fisher Scientific in Carlsbad, and this data was all compiled by the Synthetic Biology R&D team.

00:43:11 - 00:43:24, Slide 27

Thanks so much for listening and I’ll take your questions.

00:43:24 - 00:43:28, Slide 28

Thank you so much, Natasha, for a great presentation. We’ll now begin our question and answer portion of the webinar.

00:43:28 - 00:43:46, Question 1

Our first question is: Does Thermo offer a BFP or mutated GFP cell line?

We do not currently offer these as commercial products at this point. No.

00:43:46 - 00:44:48, Question 2

Okay. We have another question here: Why were CD4 and OFP chosen over others?

During product development, we actually made a variety of different types of reporters with different fluorescent proteins and different membrane protein expression. We chose OFP and CD4 because they worked quite well, as well as for additional internal business reasons. OFP was a nice choice because it's very bright and it's compatible with most detection methods; you can see it easily under an EVOS, and you can see it no problem with the 488 laser. Additionally, for CD4, we already had such a nice workflow set up with our Dynabeads kit.

00:44:48 - 00:46:04, Question 3

Great, okay. Let's see, we have another question here: Have you tested this protocol in stem cells, and how easy is transfecting them with CRISPR? If it's hard, how can we make it easier? Can you elaborate a little?

Yes, I absolutely can. That was one of our biggest challenges early on, getting high efficiency in stem cells. We have now been able to get very high efficiency in stem cells using both electroporation and standard lipid methods. It is still quite difficult, and you have that added element in the workflow of needing to prove that they're still stem cells after the fact. They are more difficult to clonally isolate, which is a big impediment to the workflow, and they're not particularly easy to sort, but we have actually used all of our tools on stem cells. We get very, very good results using our Cas9 protein in stem cells with our IVT guide RNA. We also have very nice results with Cas9 mRNA using either electroporation or lipid-based methods.

00:46:04 - 00:46:33, Question 4

Okay, great. We have another question here: Does this method work for detecting CRISPR-mediated mutations in transgenic plants instead of plant cells?

I cannot speak specifically to transgenic plants. Which method are you referring to, the Genomic Cleavage Detection Kit?

00:46:33 - 00:47:06, Question 5

I’m assuming so, yes. They didn’t elaborate.

Yes. Yes, let me - I believe we have some plant data. If you’re interested in knowing more about that, I’m going to say I’m not the plant expert of the group, but I believe we do have some plant data and feel free to send me an email and I can see what I can find for you, more specifically, but yes, we have worked on plants, and actually done a lot with TALENs in plants with some of our collaborators.

00:47:06 - 00:48:21, Question 6

Great, great information. Okay, another question is: Is there a difference between hESC and iPSC transfection procedures?

Yes, there is. It's very interesting, because we see it vary greatly from target to target. Originally we saw that iPSCs were actually quite a bit easier to edit than hESCs, but as we further investigated and compiled more and more data, we definitely see that it really is target-specific. With iPSCs, there are a variety of other reasons that make them a little easier to work with than hESCs, but we do have a resident stem cell expert, so if you have any further questions or you'd like me to elaborate, again, feel free to send me an email and I can put you in contact and see what data we can share regarding hESC versus iPSC genome editing.

00:48:21 - 00:48:42, Question 7

Okay, great. We have another question here: Transfected cells are quite heterogeneous. Have you put any effort into producing more homogeneous engineered cells?

Yes. Well you mean as far as the nature of the indel produced?

00:48:42 - 0:49:51, Question 8

I’m assuming that’s what they’re asking. Yes.

Okay. Yes, we do. The number one thing is really improving the transfection efficiency and getting it as high as possible, so we have the best population to work with prior to carrying out the clones and doing the deep sequencing and analysis. We are seeing that, again, it's quite messy, but certain CRISPRs, or certain guide RNAs I should say, tend to give more characteristic mutations. So, as far as getting a more homogeneous cell population: first, optimize for transfection efficiency and get it as high as possible, and then often sort, because you'll see distinct populations within the transfected cells; sort for the most homogeneous populations possible. Then, when you get to sequence confirmation, it removes a little bit of that noise and makes characterization and homogeneity a little bit better.

00:49:51 - 00:50:39, Question 9

Great, and we have another question here: What software do you use for your flow analysis?

Honestly, we use the Attune [NxT] software more often than not. Of course, sometimes we'll use FlowJo; we have FlowJo and use that, but for the majority of our analysis we have everything set up and all of our protocols saved, and we actually find the most efficient, easiest way is to use the Attune [NxT] software, because it's very quick, all of our plate maps are set up, we very quickly have the data we want to present and keep in our notebooks, as well as all of our statistical data ready to go and export from there.

00:50:39 - 00:52:07, Question 10

Okay, great. We have another question in, and we have a lot of good questions coming: How do I target multiple sites within a genome at the same time using the CRISPR system?

We actually have some really nice data for that, and I believe it is all shareable and I can provide it, based on a paper that we published last year from our group. Yes, we see excellent efficiencies targeting multiple different areas in the genome by just co-transfecting with multiple guides and Cas9, and then assaying those different loci. In one example of the experiments we did, we took four guides, co-transfected them together, did a quick genomic cleavage detection assay on single versus combined guides by amplifying each target locus individually, and then went ahead and looked at the cells that contained all four of those mutations, and we've seen really nice multiplexing results. So the short answer is yes, it's very doable. Your efficiency, of course, is going to go down just statistically: if you have one guide giving 70%, maybe all four together will be much smaller, maybe 30 to 40%, but it can be quite good.
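As a rough back-of-the-envelope check of why multiplex efficiency drops, if each of four edits occurred independently at about 70% efficiency, the expected fraction of cells carrying all four would be 0.7 to the fourth power, roughly 24%, the same order of magnitude as the multiplex efficiencies described here. A tiny sketch of that calculation, with hypothetical numbers:

```python
# Minimal sketch: if each of four edits occurred independently at ~70%
# efficiency, the expected fraction of cells carrying all four edits is
# simply the product of the per-site efficiencies. Hypothetical numbers.
per_site_efficiency = 0.70
all_four = per_site_efficiency ** 4
print(f"Expected cells with all four edits: {all_four:.0%}")  # ~24%
```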

00:52:07 - 00:53:43, Question 11

Okay, great, and we have another question here: Do you measure other contributing metrics to expression, such as cell health, signaling, or function?

Yes, we sure do. This is a part of our research that we are really growing and expanding right now. Of course, as I mentioned, in analyzing these clones we use a lot of assays, and a lot of it is pretty straightforward. We look at proliferation rates; that's one that I do a lot in validation of these stable cell lines that have been edited, using our CellTrace Violet. I use a lot of the SYTOX LIVE/DEAD stains, as well as some of our annexin and caspase kits. Those are all related to viability, but then, depending on the target that we're knocking out, if we're knocking out a receptor, for example, we have other specific methods for looking at that. Now that we are making these libraries, we are moving much more into doing a lot more phenotypic assays, and I hope to provide some nice application data for some of the things we're working on there now. Those, of course, are a lot more specific, because it depends on the specific target and choosing the correct antibody assay for flow cytometry. We also use Western blots, deep sequencing, qPCR, and a variety of different methods for that.

00:53:43 - 00:55:19, Question 12

Great. We have another question here: Are there any design strategies you can use to reduce off-target effects?

Yes, absolutely. If you want to, look at our CRISPR Search and Design Tool; it's very nice. This was created internally based on all the papers that were available and on what we know collectively about off-target analysis, and there are rules for minimizing off-target effects. There are a lot of different rules that we see, that we implement, and that have been incorporated into this algorithm. Rather than explaining or going over each and every rule (I do have some more information, and you're more than welcome to message me if you'd like more detail), this tool that we've developed actually already has that built in. You can input your sequence and it gives you a score, and that score is based in part on the off-target analysis. So if you want to design four CRISPRs, for example, it will give you a list and rank those different CRISPRs on a percentage based on a variety of different parameters; one of the highest contributing factors is off-target potential. We also know some things about what makes a CRISPR a little more active, and we've incorporated those as well, but my recommendation would be to definitely use our tool, or another tool (there are a variety out there), to help improve that.

00:55:19 - 00:57:28, Question 13

Great, and we have a slightly more general question here. Researchers are having some problems introducing specific point mutations in their cell lines. Is there a method with good efficiency that you could recommend, or could you maybe talk about your experiences?

I can talk a little more in detail about us trying to introduce these SNPs; that's part of what we use HR for, as in the last slides I showed looking at the BFP-to-GFP conversion, or the case of restoring the disrupted GFP. Our recommendation is to use homologous recombination rather than relying on non-homologous end joining, co-transfecting with a donor DNA that will result in the point mutation. Now, homologous recombination is traditionally more difficult, and we see much lower efficiencies than with NHEJ, but as you can see, that's one of the things that we're trying to work on: delivering the most optimal conditions and looking at every single aspect of that workflow to make it easier and increase the efficiency. As you can see, we've been able to increase it quite a bit. Of course, the analysis and validation downstream does become more difficult when you're looking at an endogenous gene, because you're going to have to sequence those out at the end, but I always recommend optimizing with your cell line, and with your target if possible, optimizing all the conditions before actually moving forward to that step.

00:57:28 - 00:57:56, Question 14

Great. We have another question here: Could you detect the level of a knockout? For instance, this researcher is working with polyphenol oxidase transcripts, and they mentioned that they can compare the degree of knockout to determine which has the highest expression. Can you speak to that a little bit?

I’m not entirely sure if I understand the question. Are they talking about verifying the knockout using the expressions of the genes, using a downstream method?

00:57:56 - 00:58:51, Question 15

That’s what I’m assuming as well is just monitoring the level of how effective their knockout was. That’s how I would interpret their question.

Okay. Any time you have something like that where you can relatively easily assay the knockdown, I think that's excellent. It's not always that easy, but if you're trying to get the best knockdown, what I would suggest, since the question pertains to experimental design, is to try several different guide RNAs, and then perhaps even a pool of guide RNAs, and then downstream measure the expression, or measure the knockdown, in the different cases, and find the best guide RNA or the best pooled conditions to maximize that knockdown efficiency.

00:58:51 - 00:59:43, Question 16

Great. Let’s see, we have another question here is which repair pathway would you recommend to repair CRISPR-mediated double-strand breaks?

Well, they'll both repair these breaks. Non-homologous end joining is much more efficient, so if you're looking for a knockout or a bi-allelic knockout, then I would definitely recommend NHEJ; use NHEJ if you're mainly concerned with getting a knockout at very, very high efficiency. Homologous recombination is generally for when you want something a little more specific, or if you're trying to very specifically add or delete a gene or correct a SNP, for example.

00:59:43 - 01:01:16, Question 17

Great, and we have time for just one more really quick question: I see that you offer a variety of platforms for CRISPR-Cas9 genome editing. What is your recommended platform for the highest editing efficiency?

Okay, so this definitely does depend on the application, but the short answer is that we see the highest efficiencies and lowest toxicities across the board using our Platinum Cas9 nuclease, that's the RNP platform, combined with the in vitro transcribed guide RNA. That would be recommended for transient transfection or genome editing, but we also have Lenti. Our Lenti libraries, which are primarily used for screening, stably integrate the guide RNA, so that's going to give very high efficiency, and you're able to select for the guide RNA-containing cells and select for that knockout over time, where the guide RNA is continuously expressed, which was, I think, shown in my third-to-last slide. So that produces very, very high knockout efficiency as well. If you'd like to use Lenti, that's generally used a little more for screening rather than transient transfection. There are other considerations where plasmid and mRNA are also very nice tools, but definitely the highest is the RNP plus IVT guide RNA combination.

01:01:16 - 01:01:36

Great. Unfortunately, that's all the time we have today for the presentation. Thank you, everyone, for sending in your questions. We got a lot of questions about cost and availability of products from Thermo; our presenter will follow up with you directly by email to answer those questions.

01:01:36 - 01:02:34

First of all, we’d like to thank Natasha Roark for sharing her knowledge with us today and giving a great presentation, and also offer a special thank you to Thermo Fisher Scientific for sponsoring today’s event. Please keep a lookout for an email containing a link to the on-demand version of this webinar. Thank you everyone for attending and have a great rest of your day.


Advancing immuno-oncology discovery and development with flow cytometry

In this webinar, Chris Langsdorf, scientist at Thermo Fisher Scientific, discusses how flow cytometry can effectively be employed in research efforts aimed at advancing the discovery and development of various immunotherapy strategies.


Cancer stem cells and mechanisms of multidrug resistance by flow cytometry

In this webinar, Dr. Jordi Petriz (The Jose Carreras Leukemia Research Institute in Barcelona) discusses how flow cytometry can be a key player in investigating cancer stem cells. The ability of normal and tumor stem cells to express transporter molecules that function to exclude anticancer drugs and certain dye molecules can potentially be exploited to identify and isolate cancer stem cells by flow cytometry.



Basics of Multicolor Flow Cytometry Panel Design

With the proliferation of new fluorescent dyes, as well as instruments that can detect 18 or more parameters, multicolor flow cytometry has become more popular and more accessible than ever. This webinar will discuss the caveats of good panel design, including:

  • Rules for designing panels
  • Examples and practical application of these rules
  • Controls and standardization
  • Relevance of panel design to new mass cytometry platforms

View the recorded webinar


DNA Content Cell Cycle Analysis Using Flow Cytometry

Find out how careful acquisition and methodical preparation contribute to accurate and consistent DNA analysis. In this webinar we’ll discuss:

  • An overview of the methods and materials for using flow cytometry to determine cell cycle by measuring DNA content
  • Selection of DNA dyes for live cell and fixed cell analysis
  • Tips and tricks for consistent results

View the recorded webinar


Flow Cytometry in Microbiological Research

In recent years the application of flow cytometry in microbiological research has expanded from detection and quantification of organisms to more complex studies including analysis of host-microbe interactions and detailed spatial and temporal analysis of microbial metabolism in different environments. During this webinar we will discuss how the multi-parametric nature of flow cytometry can be applied to microbiology and the advantages of using this application over traditional microbiological methods.

View the recorded webinar


Basics of Flow Cytometry, Part I: Gating and Data Analysis

An introduction to fluorescence, this presentation covers the following topics: the principle of the flow cytometry platform, the basic components of a flow cytometer, how to interpret a dye excitation/emission spectrum, how data is displayed, basic gating demonstration, and common statistics and terminology used in flow cytometry.

View the recorded webinar


Basics of Flow Cytometry, Part II: Compensation

This presentation provides an overview of basic fluorochromes used in flow cytometry. Topics include: the principle of compensation, how to perform compensation, the types of controls recommended and their use, basic strategies for designing a flow experiment, and data presentation.

View the recorded webinar


An Introduction to Flow Cytometric Analysis Using Molecular Probes Reagents,
Part I: Cell Proliferation Analysis

In this free webinar, we will discuss flow cytometric analysis of cell proliferation, including CellTrace Violet and CellTrace CFSE, Click-iT EdU, dual-pulse labeling with EdU and BrdU, Vybrant DyeCycle stains for live cell cycle analysis, and FxCycle stains for fixed cell cycle analysis.

View the recorded webinar


An Introduction to Flow Cytometric Analysis Using Molecular Probes Reagents,
Part II: Viability, Vitality, & Apoptosis Analysis

In this free webinar, we will discuss flow cytometric analysis of apoptosis and identification of dead cells. Changes introduced by apoptosis can be tested with numerous assays measuring membrane structure, mitochondrial function, metabolism, caspase activity, membrane integrity, and DNA fragmentation. We will also discuss dead cell identification using traditional impermeant nucleic acid dyes.

View the recorded webinar
