This guy (a renowned smallpox researcher) was able to synthesize a cousin of smallpox (horsepox) using commercially and publicly accessible tools and resources. He did this to try to create a new, better smallpox vaccine, because he believes that motivated actors will be able to synthesize smallpox within the next 20 years, and that the world needs to be ready with better vaccines when that happens. His team's resulting publication ("we made smallpox, this is the gist of how") was met with strongly negative responses.
The gist of these articles is that while synthesizing functioning viruses/microorganisms was possible in the past (TFA says the first "synthetic" organism was made in 2010), it's much easier, faster, and cheaper to do so today, when CRISPR techniques and tools are more widely available and better understood.
So the essence is "we are doing it first, because otherwise others will do it first"... So the race over who gets there first continues... No (further) comment...
Why is this? If AGC is identical in function to TCG, and the same holds for the other replacements, shouldn't the new organism function exactly the same?
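The question above rests on the redundancy of the genetic code: several codons map to the same amino acid, so a synonymous swap leaves every encoded protein identical. A minimal sketch (the table excerpt below lists only serine's six standard codons; the helper name is invented for illustration):

```python
# Serine is encoded by six synonymous codons in the standard genetic code.
SERINE_CODONS = {"TCT", "TCC", "TCA", "TCG", "AGT", "AGC"}

def translate_codon(codon):
    """Toy lookup: return 'Ser' for any serine codon, else None."""
    return "Ser" if codon in SERINE_CODONS else None

# AGC and TCG translate identically, so the protein sequence is unchanged...
assert translate_codon("AGC") == translate_codon("TCG") == "Ser"
```

...yet, as the quotes below note, the DNA-level differences can still matter for things other than the protein sequence, such as translation efficiency.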
> Unfortunately, there's a big gap between what a DNA synthesis machine can output and the multi-million-base-long genome. The group had to do an entire assembly process, stitching together small pieces into a large segment in one cell and then bringing that into a different cell that had an overlapping large segment. "Personally, my biggest surprise was really how well the assembly process worked," Schmied said. "The success rate at each stage was very high, meaning that we could do the majority of the work with standard bench techniques."
> During the process, there were a couple of spots where the synthetic genome ended up with problems—in at least one case, this was where two essential genes overlapped. But the researchers were able to tweak their version to get around the problems that they identified. The final genome also had a handful of errors that popped up during the assembly process, but none of these altered the three base codes that were targeted.
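The hierarchical assembly the quote describes — stitching small synthesized pieces into ever-larger overlapping segments — can be caricatured in a few lines. This is a toy model with invented sequences and an invented helper name; the real work happens inside living cells, not in string concatenation:

```python
def join_on_overlap(left, right, min_overlap=3):
    """Join two fragments that share an exact overlapping end.

    A toy model of overlap-based assembly: find the longest suffix of
    `left` that matches a prefix of `right`, then merge on it.
    """
    for k in range(min(len(left), len(right)), min_overlap - 1, -1):
        if left[-k:] == right[:k]:
            return left + right[k:]
    raise ValueError("no sufficient overlap")

# The two fragments share the 'GGG' overlap and merge into one segment.
print(join_on_overlap("AAACCCGGG", "GGGTTT"))  # AAACCCGGGTTT
```

The quoted success rate is surprising precisely because each such joining step in the lab can fail, and the genome required many of them in sequence.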
So it sounds to me like the process wasn't quite perfect. They also note that DNA "redundancy can also allow fine-tuning of gene activity, as some codes are translated into proteins more efficiently than others".
These effects are often minor, but this bacterium has undergone hundreds of millions of years of optimization via natural selection, and some researchers have come along and disrupted 18,000 sites. Probably the slower growth and length abnormalities just mean the bacterium is a little miscalibrated and displaying minor symptoms of malaise.
Edit: this is the frequency of that codon being used in the genome sequence. But the assumption could be that it is also preferentially produced as a tRNA so it can replicate its genome efficiently.
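Codon usage frequency of the kind mentioned here is just a count over the coding sequence. A minimal sketch (the sequence is invented; real analyses would run over annotated coding regions of the whole genome):

```python
from collections import Counter

def codon_usage(cds):
    """Count codon frequencies in a coding sequence (length must be a multiple of 3)."""
    if len(cds) % 3 != 0:
        raise ValueError("coding sequence length must be a multiple of 3")
    return Counter(cds[i:i + 3] for i in range(0, len(cds), 3))

usage = codon_usage("TCGTCGAGCTCG")
# In this toy sequence TCG appears 3 times and its synonym AGC once,
# so TCG would be the 'preferred' serine codon here.
print(usage["TCG"], usage["AGC"])
```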
It's more likely that they have introduced errors, or non-preferred patterns, which affect the number of replication forks and stall the replication machinery.
The tools used by this particular study:
I wouldn’t be surprised if the academic team in this case developed something relatively simple. Let’s try to find out how they dispatched the synthesis orders they made over the two years.
Was it Excel or CSV at the end of the day?
If they were an enterprise biotech I would bet they would have a much more elaborate in-house data & design toolchain plus a LIMS. But academic teams rarely have the resources or drive necessary to engineer digital tools approaching that level of sophistication.
Other software academics might use
- maaaaybe antha-lang
Alternatively, perhaps the synthesis vendor supplied their own optimized design and inventory tool. Who did they buy from, Gen9?