Sadhbh O’Neill
In attendance as member of EPA advisory committee on GMOs, representing the Environmental Pillar.
This symposium is an event to facilitate dialogue among scientific experts across the EU and EEA on current trends in regulating the contained use and deliberate release of GMOs.
This document is merely intended as a personal note of a conference and does not represent an opinion on biotechnology or the views of the Environmental Pillar. I have organised the notes thematically rather than in the order the papers were presented, and have tried to draw out the most important issues as I understood them. However, this may not be a 100% accurate account of the papers given, and for this reason I have deliberately omitted the names of the authors.
1. New technologies and risk assessment
The first session focused on new technologies such as CRISPR/Cas9, which allow for ‘genome editing’ that may not leave traceable or detectable changes in the genome. The technique allows for the removal, replacement or insertion of DNA/RNA using advanced ‘molecular scissors’ or ‘genetic microsurgery’. These techniques have a number of advantages and applications: they seem to allow a higher degree of specificity and accuracy in developing transgenic products, while being much faster and more cost-effective to develop than previous approaches. Applications include:
– Reverse genetics (knocking out gene expression)
– Synthetic biology
– Animal and cell-based models of human diseases
– Gene therapy (gene correction)
Safety issues include:
– Minimising off-target events
– Avoiding immunogenicity (causing an immune response)
– Minimising delivery size (of rDNA)
Scientists working with this technology report a trade-off between activity and specificity (how to get maximum impact from minimum genetic modification). They report that the technology is both versatile and scalable, with low toxicity effects. It appears to have considerable potential for agricultural and medicinal applications.
This technology is advancing faster than the regulations, which raises many questions about the efficacy of the current EU regulatory regime. For instance, a move towards product-based risk assessment could possibly permit some new crop breeding techniques to bypass current regulatory requirements. All of these considerations make risk assessment a more complex task, especially as, in some cases, notifications for market approval or releases may involve a combination of technologies or New Plant Breeding Techniques (NPBTs).
In devising appropriate risk assessment techniques, competent authorities should address specifically those aspects of the application that are ‘risk relevant’: for instance, are the new traits stable? Has there been stable integration of the new genetic material, where relevant? It is worth noting that in Canada the regulatory focus of risk assessment is not on genetic modification per se, but on the ‘novelty and riskiness’ of new plant products, which potentially covers all NPBTs. Because of the nature of the new techniques, it is becoming more and more difficult to define a GMO, again raising the question of whether the focus should be on assessing the method or the product for its potential risks to human health and the environment.
2. Ecological impacts of GMOs
A number of papers were presented outlining research and research methodologies used to assess the environmental effects of GMOs. One such example was a large study in the Netherlands conducted to devise a baseline of a Normal Operating Range (NOR) of a soil ecosystem, so that any changes could be measured using conventional statistical techniques. Given the importance of soil ecosystems to overall ecosystem health and diversity, it is especially important to be able to devise techniques to measure impacts on non-target organisms (NTOs) and biogeochemical processes. Fungi, invertebrates, disease antagonists and soil microbial communities all contribute to nutrient provision in plants and to soil fertility, health and structure. GMOs could be said to be having adverse effects if they move a soil community outside the normal variation, but first it is essential to establish a baseline that includes natural variation.
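The logic of the NOR approach can be sketched in a few lines of code. This is only an illustrative toy, not the Dutch study's method: the indicator names, values and the mean ± 2 standard deviations rule are all my own assumptions, chosen to show how a measurement from a GM plot would be flagged only when it falls outside the baseline's natural variation.

```python
import statistics

# Hypothetical soil indicators measured in reference (non-GM) plots over
# several seasons; indicator names and values are illustrative only.
baseline = {
    "fungal_biomass": [4.1, 3.8, 4.5, 4.0, 4.3, 3.9],
    "nematode_count": [120, 135, 110, 128, 140, 118],
    "nitrification":  [0.82, 0.79, 0.88, 0.85, 0.80, 0.83],
}

def normal_operating_range(values, k=2.0):
    """NOR taken here as mean +/- k sample standard deviations (an assumption)."""
    m = statistics.mean(values)
    s = statistics.stdev(values)
    return (m - k * s, m + k * s)

def outside_nor(observed, baseline, k=2.0):
    """Return the indicators whose observed value falls outside the NOR."""
    flagged = []
    for name, values in baseline.items():
        lo, hi = normal_operating_range(values, k)
        if not (lo <= observed[name] <= hi):
            flagged.append(name)
    return flagged

# A hypothetical measurement from a GM field plot.
gm_plot = {"fungal_biomass": 4.2, "nematode_count": 60, "nitrification": 0.84}
print(outside_nor(gm_plot, baseline))  # → ['nematode_count']
```

The point of the sketch is that nothing is judged against zero change: the nematode count is flagged only because it falls well outside the spread already present in the reference plots, while the other two indicators sit comfortably within their natural ranges.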
Key issues following from the Dutch study: it seems that such studies would have to be carried out for each crop and each site (and not just on soils), measuring as many variables as possible, all of which is time-consuming and expensive. Another ongoing research project, in Spain, is attempting to optimise field trials to minimise the cost and number of indicators required to ascertain adverse effects on NTOs. Questions arise regarding the statistical robustness of the field trial methodologies if the number of variables is reduced; however, the key lesson is that it is not necessary to test non-representative taxa.
Another paper looked at the relative mutation rates of GM crops compared with non-GM varieties. It found that while GM varieties do have higher mutation rates, set against background variation these mutations are not statistically significant. The presenter suggested there was no real need to report individual mutations, which would require the submission of whole-genome data. It was suggested that risk assessors should request the sequence of the inserted T-DNA and flanking genes, but not the whole genome or details of individual mutations.
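What "not statistically significant against background variation" means can be illustrated with a toy calculation. The numbers below are entirely hypothetical (the paper's actual data were not given): assuming spontaneous mutations accumulate roughly as a Poisson process with a known background rate, one can ask how surprising a slightly elevated count in a GM line would be.

```python
import math

def poisson_sf(k, lam):
    """P(X >= k) for X ~ Poisson(lam): the chance of seeing k or more
    mutations if only background mutation were at work."""
    # 1 - CDF(k-1), computed from the Poisson pmf
    cdf = sum(math.exp(-lam) * lam**i / math.factorial(i) for i in range(k))
    return 1.0 - cdf

# Illustrative numbers only: suppose a conventional line accumulates ~70
# spontaneous mutations per genome per generation, and a GM line shows 78.
background_rate = 70
observed = 78

p = poisson_sf(observed, background_rate)
print(f"P(>= {observed} mutations from background alone) = {p:.3f}")
```

Under these made-up numbers the excess is well within what background mutation alone produces (p far above 0.05), which is the shape of the argument the presenter made: a modestly higher count in a GM variety need not be distinguishable from natural variation.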
A presentation was given on a proposed notification (which was subsequently withdrawn) for a GM olive fly in Spain by a company called Oxitec. The olive fly is responsible for a lot of crop damage (Spain produces over 50% of the world’s olive oil) and is becoming resistant to insecticides and other biological controls (some of which are being phased out of the EU anyway). Oxitec developed a GM variety which has the characteristics of conditional lethality (all females die after mating) and has a non-toxic fluorescent marker.
The interesting thing about this application is that, following requests for further information and research, what was intended to be a ‘deliberate release’ now has more of the features of a ‘contained use’: the trial is now set up inside air-locked tents, with various provisions for preventing the release of the GM flies, or for ensuring their destruction if they do escape. One has to wonder whether the trial as it is currently set up will help resolve the question of impacts on NTOs, or whether it will simply confirm the efficacy of the GM variety. The broader lesson is that wherever there are questions of risk to be addressed, the research must be given the scope to ‘ask’ and ‘answer’ the questions posed.
Another study analysed all the current research on the potential effects of the use of antibiotic resistance marker genes (ARMGs). The latest EU guidance (Regulation (EU) No 503/2013) discourages the use of ARMGs, although there is no scientific evidence of any danger arising from their use. The study looked at alternatives to ARMGs adopted by research agencies and firms, and queried what underlying considerations dictated the choice of ARMGs or their alternatives. Interestingly, public opinion and regulatory pressure lay behind some decisions to use alternatives, whilst those companies and research bodies that continued to use ARMGs justified their choice on the basis of safety, ease of use and cost.
Another paper looked at the issue of the exposure of aquatic organisms to pollen from Bt-maize. The research project devised a robust methodology for analysing the quantity of pollen that would end up in water systems, but no adverse effects were reported and the researchers concurred with existing research that the Bt toxin becomes inactive in soil sediments and does not appear to pose any risk to aquatic organisms. However they did not explain how the Bt toxin becomes inactivated.
3. Use of GM in virology and gene therapy
One of the most controversial issues in biotechnology is the use of GM technologies in studying viruses, particularly those capable of becoming pandemic or zoonotic (any disease of animals communicable to humans). While the study of viruses in laboratory settings is not new, it is now possible to sequence their genomes easily (viruses have relatively short, simple genomes compared to plants and mammals) and to generate mutations using various new technologies, in order to learn more about how a virus evolves and adapts. A key question is how many mutations it takes for a virus to become airborne transmissible (without which it does not pose a major pandemic risk to humans). Current research suggests that it takes five or six mutations for a flu virus to gain airborne transmissibility, and in the case of H7N9 (I think) three of these mutations have already occurred. In addition, once airborne the virus appears to be more stable, and with a 25% death rate a serious global pandemic could cause millions of deaths. However, with good protocols viruses can be ‘stopped’, as happened with SARS.
A laboratory in the Netherlands has been working on this and has come under fire, mostly from US politicians who are (mistakenly) associating the risks of these procedures with the potential for bioterrorism or for accidents due to the failure of safety protocols. In the US, ‘gain of function’ research may not be publicly funded for this reason; however, the EU still backs the Dutch research. Understanding how viruses gain the function of transmissibility could be key to predicting pandemics and developing vaccines, and research projects that follow all the correct protocols do not pose any risks to the public.
A paper from a British virologist highlighted the need for a broader regulatory conception of ‘benefit’ and a better understanding of relative risk. He noted that vaccines are always slow to roll out and rarely meet the needs of people affected by the first wave of a pandemic. It is difficult to describe the benefits of this type of basic research on an ‘a priori’ basis – it’s always easier to justify the research ‘a posteriori’. We should be open to the use of new technologies to gain better understanding of the behaviour of viruses and to be able to design effective vaccines more quickly.
In the case of gene therapy there appear to be many advances which will shortly be appearing as approved treatments. With the advances in sequencing technology, it is possible to broaden the therapeutic indications under which gene therapies might be considered. It has been noted for decades that some viruses, e.g. measles, attack cancer and tumour cells effectively, but the challenge has been to design a delivery pathway for the virus that is specific to the tumour site and safe in all other respects. Key challenges are efficiency (gene transfer, stable integration and gene dose regulation) and safety (immunogenicity, mutagenesis, germline transmission). With the CRISPR/Cas9 techniques now available it is possible to design better models of human diseases, although it seems likely that animals will be both tested and treated first.
4. Public attitudes and evidence-based policy
Whilst all of the information on current biotechnology is widely available in published form, and in the dossiers supplied to regulators as part of market approval applications, even publicly funded scientific research usually takes place in research institutes and laboratories without input from ‘stakeholders’. Some of the participants I met at the conference were also involved in running field trials and had experienced vandalism and the high costs associated with protecting sites from interference (in one case the fencing around 25 ha alone cost €150,000). The scientists working in this field tend simply to get back to work after a set-back or ‘incident’ and have, to a degree, ‘given up’ trying to explain what they are doing to a cohort of the population that is opposed to GM technologies for a host of reasons: safety concerns, but also anti-globalisation concerns, and concerns about the costs (to local food production systems) and the benefits (to multinationals).
One very large EU-funded project in Germany, however, made stakeholder consultation a key part of its work programme, with dedicated funds for this purpose and a commitment to complete transparency of all the data. Curiously, this project was a feeding trial for a Bt-maize variety (MON 810), and while the EU call specifically sought this aspect of the study, it is not scientifically obvious why it was carried out, other than as a response to public and political pressure. EFSA requires a 90-day feeding trial only if triggered by a high degree of uncertainty.
The effect of opening up a debate in this way has possibly helped prevent vandalism, although the feeding trial attracted animal rights activists as well as anti-GMO protesters. The scientists found the project criticised by all sides but have remained committed to the process of stakeholder inclusion and the publication of all data in an open access journal. However, there are still some limitations to transparency, as non-CBI-protected data is only available from EFSA in a ‘read only’ format.
Although no adverse effects were reported from this trial, some questions about the methodology of feeding trials were posed: for instance, can the trial detect the toxin in the rats? Shouldn’t there be a comparator as well as a control group, in which another group of rats is fed some other kind of ‘toxic’ or ‘bad’ food?
The final paper of the symposium was given by a lead member of a biosafety sub-committee on ethics and societal issues. He described a conventional trajectory for a ‘public’ debate on some scientific controversy – it could be anything from mobile phone masts to vaccination to fracking or biotechnology. Typically the scientific consensus is represented by an organised advisory structure which is then challenged by a legitimate counter-discourse. Scientific and policy authorities respond by producing more reports and studies but miss the point that societal debate is fuelled by broader concerns over the technology at stake, for example, who benefits financially? Will this technology promote sustainability?
Interestingly, the key issue is not public trust in science – which is high, and higher than trust in politicians – but a lack of trust where science is tied to tainted policies or where it seems to be delivering up ‘serviceable truths’. The public does want science to support societal goals such as improved health and well-being. However, if the ‘underlying debate’ is not allowed to surface, the disputes simply continue and positions become entrenched. The recommendations were:
– Don’t narrow policy issues to scientific questions
– Organise a broader socio-political debate
– Build a common research agenda
– Create transparency over scientific uncertainty
Sadhbh O Neill
17th October 2014