Interview with an iGEM member

Dr. Ashok Palaniappan, a Senior Assistant Professor at SASTRA Deemed University, was one of the two Principal Investigators for the first SASTRA team to participate in the International Genetically Engineered Machine (iGEM) Competition, held in Boston in 2019. Join us as he walks us through his iGEM journey.

How did you come to know about this competition in the first place and what is “iGEM” about?

P: iGEM is a prestigious competition that promotes the application of engineering principles to biological science, so any reader in biotechnology would chance upon some reference to iGEM. I have known of iGEM through publications, and an institution where I previously worked had representation at iGEM. Many premier institutes of India also participate in this competition; in fact, one of my professors at Anna University is part of the IIT Madras iGEM team. So I got to know about this competition through multiple avenues. In 2018, while I was teaching a course called Systems Biology to Priyannth (one of the team leaders of the iGEM team), we had a discussion on iGEM, and I motivated the formation of a team.

How would you describe “synthetic biology” to a non-biologist?

P: To give a broad definition, let me draw an analogy: just as mathematics can be pure or applied, biology can be viewed as having fundamental research and applied research, where advancements can be applied for human welfare. The sort of applied research that harnesses biological principles to solve common problems of industrial and medical interest can be called biotechnology, and today it is also called synthetic biology. Broadly, the application of engineering principles to biology and the creation of new genetics would be called synthetic biology. To give a more focused definition, synthetic biology involves the creation of modular genetic components that can be combined in a circuit, just like electrical components, to create new designs for a novel purpose.

Can you explain the motivation and crux of your work?

P: We focused on cervical cancer as the broad area of interest for the iGEM project. Cervical cancer affects the female half of humanity and is diagnosed with the Pap smear, which is an invasive test. In our project, we screened for biomarkers in the blood and identified miRNA biomarkers for the early stages of cervical cancer. We then designed genetic circuits that light up in the presence of these biomarkers, and modeled and experimentally calibrated those designs.

The execution of such a project was a gargantuan task. Can you briefly tell us about it?

P: During the execution, we went through three iterations of the Design-Build-Test-Learn (DBTL) cycle of engineering innovation.

In the first iteration, we identified four differentially expressed miRNA biomarkers and performed survival curve analysis. We found that one of them did not add to the signature, and so came down to three. We then tried dropping one more and found that a set of two biomarkers gave the same significance as the three. So that was the first cycle: we started with four biomarkers and, on performing survival curve analysis and testing, learnt that two biomarkers are sufficient.
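The team's exact pipeline is not described here, but a survival screen of this kind is commonly done with Kaplan-Meier curves and log-rank tests. Below is a minimal sketch using the lifelines library; the DataFrame layout (a 'time' column, an 'event' column, one expression column per candidate miRNA) and the median split are assumptions for illustration, not the team's actual protocol.

```python
# Hedged sketch of a biomarker survival screen with lifelines.
# Column names and the median split are illustrative assumptions.
import pandas as pd
from lifelines.statistics import logrank_test

def logrank_p(df: pd.DataFrame, expr_col: str) -> float:
    """Split patients at the median expression of `expr_col` and return
    the log-rank p-value between high- and low-expression groups."""
    high = df[expr_col] >= df[expr_col].median()
    result = logrank_test(
        df.loc[high, "time"], df.loc[~high, "time"],
        event_observed_A=df.loc[high, "event"],
        event_observed_B=df.loc[~high, "event"],
    )
    return result.p_value

# Drop-one analysis in the spirit of the interview: if removing a
# candidate leaves the signature just as significant, it adds nothing.
# candidates = ["miR_A", "miR_B", "miR_C", "miR_D"]  # placeholder names
# for c in candidates:
#     print(c, logrank_p(df, c))
```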

In the second iteration, we designed the genetic circuits that sense these identified biomarkers. These genetic circuits carry toehold sequences that recognize the biomarkers, so this step involved the sequence construction of the toeholds, which had to be optimized for efficacy. This was the second DBTL cycle.
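The interview does not detail the optimization procedure, but toehold candidates are often ranked by thermodynamic features computed with ViennaRNA. Here is a hedged sketch using its Python bindings (the RNA module), with placeholder sequences rather than the team's actual switch and trigger constructs.

```python
# Hedged sketch: scoring a toehold switch/trigger pair by free-energy
# features with the ViennaRNA Python bindings. Sequences are placeholders.
import RNA

def toehold_features(switch: str, trigger: str) -> dict:
    """Return simple free-energy features for a switch/trigger pair."""
    _, mfe_switch = RNA.fold(switch)                    # closed-switch stability
    _, mfe_duplex = RNA.cofold(switch + "&" + trigger)  # switch:trigger complex
    return {
        "mfe_switch": mfe_switch,
        "mfe_duplex": mfe_duplex,
        "delta_opening": mfe_duplex - mfe_switch,  # energy gain on trigger binding
    }

switch = "GGGACUAAUCGACUCGGUUUGCCAGAGGUCAGAGGAGAUAAAUG"  # placeholder
trigger = "GACCUCUGGCAAACCGAGUCGAUUAGU"                  # placeholder
print(toehold_features(switch, trigger))
```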

In the third iteration, we carried out reaction kinetics modeling of the designed circuits and the subsequent experimental characterization. These circuits are second-generation toeholds, which involve the binding of an anti-miRNA along with the biomarker. Thus, we had to model two separate binding events: the binding of the anti-miRNA to the biomarker, and the binding of the resulting complex to the toehold. We arrived at a model of the amount of GFP expressed, i.e., the fluorescence intensity. We then evaluated the model against the experimental data for one of the biomarkers and found the circuit to be specific, with the sensitivity largely overlapping that of the modeling analysis.
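As a rough illustration of what such a two-step mass-action model might look like, consider the sketch below: anti-miRNA (A) binds the biomarker miRNA (M) to form a complex (C), which opens the closed toehold (T) into its active form (O), from which GFP (G) is translated. All rate constants and initial concentrations are invented for the sketch, not the team's fitted parameters.

```python
# Rough two-step mass-action model of a second-generation toehold.
# Rate constants and initial values are illustrative placeholders.
from scipy.integrate import solve_ivp

k_bind, k_open, k_tl, k_deg = 0.5, 0.3, 1.0, 0.05

def rhs(t, y):
    A, M, C, T, O, G = y
    bind = k_bind * A * M    # A + M -> C
    open_ = k_open * C * T   # C + T -> O (opened switch)
    return [-bind, -bind, bind - open_, -open_, open_,
            k_tl * O - k_deg * G]  # GFP synthesis from O, first-order decay

y0 = [1.0, 0.8, 0.0, 1.0, 0.0, 0.0]  # initial concentrations (a.u.)
sol = solve_ivp(rhs, (0.0, 200.0), y0)
print("GFP at t=200 (a.u.):", sol.y[5, -1])
```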

Although these competitions promote product development, do you feel this compels one to choose less explored facets, directing one toward fields chosen with their business prospects in mind? Do you see this aspect as a boon or a bane? Did it influence the brainstorming sessions?

P: I would not call having a product focus, or working towards one, a constraint or drawback. In most cases, an application-oriented focus keeps our ideas practical and grounded. Apart from that, iGEM offers a Foundational Advances track for teams that wish to work on ideas that are not immediately applicable. So, I think iGEM is a good testing ground for applied ideas as well as foundational advances.

Your work comprises a technical and a non-technical part. What specific technical skills were required to carry out this project? Do you feel the course curriculum bolsters these aspects? If not, do you have any suggestions for tweaking it?

P: I would say that no single student could have performed the major part of the iGEM work; each facet of iGEM really tested the students' skills in software, modeling and experiments. To highlight the skills involved: the modeling component covered the design of the genetic circuits as well as the analysis of their performance. The software phase involved predictive modeling of toehold efficacy, which required determining the free energy of the toehold-miRNA complex and other structural energetic features; a multiple regression model was then optimized over these features. The wet-lab phase involved experimental calibration of sensitivity and specificity using a cell-free transcription and translation kit.
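For readers unfamiliar with this step, a multiple regression over energetic features might look like the sketch below. The feature values and ON/OFF ratios are made up for illustration; the team's actual feature set and dataset are not given here.

```python
# Illustrative multiple regression of toehold efficacy on energetic
# features. All numbers below are invented for the sketch.
import numpy as np
from sklearn.linear_model import LinearRegression

# columns: MFE of the closed switch, MFE of the toehold-miRNA complex,
# GC fraction of the stem (an illustrative feature set)
X = np.array([[-12.3, -25.1, 0.55],
              [-10.5, -21.0, 0.48],
              [-14.2, -27.9, 0.60],
              [ -9.8, -19.5, 0.45]])
y = np.array([8.1, 5.2, 9.6, 4.0])  # measured ON/OFF fluorescence ratios

model = LinearRegression().fit(X, y)
print("coefficients:", model.coef_)
print("R^2 on training data:", model.score(X, y))
```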

So, the complexity of the technical component is that a variety of skills were tested. I would say our SCBT curriculum provided the right foundation for proving ourselves in iGEM. The areas where we required more skills and had to go the extra mile were predictive modeling and the design of modular genetic circuits; a machine learning course could provide the foundation for predictive modeling. In the experimental areas, working with the cell-free transcription and translation kit was novel to the students, who found it very fascinating.

Being the first ever team to represent SASTRA at iGEM, what were the significant challenges you had to face?

P: Being the first ever team to represent SASTRA at iGEM brought many challenges, right from getting the institution's approval to go ahead, to facilitating the contingent's travel to Boston. The iGEM project has multiple milestones, including a final wiki (wiki link: https://2019.igem.org/Team:SASTRA_Thanjavur) and a poster and presentation in Boston. Initial brainstorming sessions gave ideas that were fancy and impractical, like equipping birds with genetic circuits to sense earthquakes. So, brainstorming took a long time, and it was mainly between three students and me. Forming a team was another great challenge: we knew our project had several components, so we needed a balanced team. We also had to maintain correspondence with the iGEM office and official sponsors to ensure that the iGEM kits and orders reached us; the process was not as smooth as we expected, as there were a lot of regulatory clearances associated with the DNA products. Personally, I faced challenges in keeping the team on its toes and maintaining a sense of working together. In the last three months, everyone worked very hard, and there were a lot of long days, which enabled us to go the extra mile and complete the work.

What do you think is the impetus, in recent times, for the field of computational biology? What are some interesting areas or problem statements, according to you?

P: We are at a juncture where robotization, automation and miniaturization are ever more widely employed, and biotechnology is no exception. Today, experiments are planned to yield big data for downstream analyses. So, in my opinion, the fulcrum of scientific research is increasingly shifting toward computational science; computation is an integral part of any study. We have overcome the paucity of data, and the challenge today is how to extract meaning from the data. This is particularly challenging in biology, since the hypotheses are numerous and complex. Interesting problems include the computational modeling of the brain, which is a grand challenge. Another is how we crunch data into actionable insights and test hypotheses with machine learning.

When it comes to experiments, the "degrees of freedom", or variables, are definitely greater; what, according to you, can be done to minimize the "stochastic errors" associated with experiments?

P: I wouldn't call them stochastic errors. The key solution is effective experimental design. A well-designed experiment strikes the best trade-off between the uncertainties in the experiment and what is expected from it, and the best way to design an experiment is to let computation tell us the direction in which to go. If we had done the iGEM project without a modeling component, we might have used first-generation toehold sequences rather than second-generation ones, and then found experimentally that GFP was not getting translated because of an upstream stop codon. Instead, upstream of the experiment, in the design phase, we were able to identify that stop codons are introduced and thus that first-generation toeholds would not be useful (a sketch of such a check follows this answer). So we went ahead with the second-generation toeholds, with the assurance that the genetic circuits were well designed. All that was left was optimizing the wavelength for the fluorescence measurement studies. Since we used a cell-free transcription and translation kit, where the components were all modularized, the scope for error was greatly minimized; what remains are some measurement errors due to handling and environmental factors. In this way, experiments are optimized and helped by computation and modeling, which saves time. Secondly, modeling, and not just experiments, has a lot of variables; intuition and analysis are useful in choosing the parameters, and that is the key to effective modeling. It is equal parts art and science! We can say that,

Yesterday's science, today's technology, tomorrow's medicine!
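As promised above, here is a minimal sketch of the kind of in-silico check that can flag a premature stop codon in a candidate toehold design before any wet-lab work. The sequence is a hypothetical placeholder, not one of the team's constructs.

```python
# Hypothetical sketch: flag in-frame stop codons in a candidate design
# before building it. The sequence below is a placeholder.
STOP_CODONS = {"TAA", "TAG", "TGA"}

def premature_stops(cds: str) -> list:
    """Return 0-based positions of in-frame stop codons before the final codon."""
    return [i for i in range(0, len(cds) - 3, 3) if cds[i:i + 3] in STOP_CODONS]

candidate = "ATGAACCTGTAGGGTAAAGAA"  # placeholder in-frame region
positions = premature_stops(candidate)
if positions:
    print("reject design: premature stop codon(s) at", positions)
```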
